# The RX 6000 series Owners' Club



## GamerGuy (Dec 19, 2020)

Well, since Ampere has a club thread here, I figured why not a club for RDNA2, aka the RX 6000 series? I guess there won't be that many of us, but I wanna stand up and be counted as a proud owner of a 6000 series card. To get things started, here's mine, a Sapphire Nitro+ RX 6800 which I'd gotten a while back.
Here's a shot of the card, next to the PC VEGA64 Red Devil it'd replaced...





Here's one inside my rig, and yes, RGB does improve OC'ing on GPU and CPU, improves my health, and promotes world peace!





I've also preordered an XFX Speedster MERC319 RX 6900 XT and a PowerColor RX 6900 XT Red Devil LE, will keep the one that ships first.


----------



## turbogear (Dec 19, 2020)

Glad that you started this club.
Let me join you. 
I own a 6800XT reference card that I have converted to water cooling.

Here are some photos of my Radeon 6800XT.
The original cooler was replaced with an EK water block:
EK Water block for RX 6800 and RX 6900 RDNA2 GPUs – EK Webshop (ekwb.com)




I have been working on finding a nice undervolt setting for it.
I now have Radeon Wattman set to 2600MHz @ 1015mV with the Power Limit slider at +15%.




No overclock of memory.

I don't have the hardware to measure GPU power consumption by itself, but I made some approximations by reading the output of my Corsair HX1200i power supply and subtracting the consumption of the other components, observing the system's consumption at idle and during a GPU stress test.
I come up with ~294W at the default 6800XT settings, and ~337W with my above-mentioned undervolt and raised power limit, which means around ~43W more power consumption.
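For anyone who wants to reproduce this subtraction method, here is a minimal sketch; the helper name and all readings below are made-up placeholders, not my actual measurements:

```python
# Rough GPU power estimate by subtraction, as described above.
# gpu_power_estimate() and every number here are illustrative placeholders.

def gpu_power_estimate(psu_load_w, psu_idle_w, gpu_idle_w):
    """Approximate GPU load power from PSU telemetry.

    psu_load_w -- PSU output during a GPU-only stress test
    psu_idle_w -- PSU output at desktop idle
    gpu_idle_w -- the GPU's own idle draw (from software sensors)
    Assumes the rest of the system draws roughly the same in both states.
    """
    return (psu_load_w - psu_idle_w) + gpu_idle_w

print(gpu_power_estimate(psu_load_w=420, psu_idle_w=95, gpu_idle_w=12))  # 337
```

The weak point of this method is the assumption that the CPU and the rest of the system draw the same at idle and under GPU load, which is why the result is only an approximation.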


Because of the water cooling, the GPU temperature averages 43°C (max 46°C).
The junction (hotspot) temperature averages 63°C (max 67°C).





Here are the results from Time Spy, with SAM enabled.
My score at the default 6800XT settings with SAM enabled is 17402 points.



That is about a 9.4% higher score than at the default settings.


Stability test with Firestrike Extreme:



Here are the settings that I am using:



The Excel graph below shows the frequency distribution over the entire Firestrike Extreme stress test run:




Next, I will run some tests with these settings while playing games like Borderlands 3 and Cyberpunk.


----------



## GamerGuy (Dec 23, 2020)

Man, you can hear the crickets chirping!   Nvm, when I get the MERC319 RX 6900 XT, I'll be making a bit more noise!


----------



## phill (Dec 23, 2020)

I'm here for the pics, hardware and info on these amazing new GPUs


----------



## GamerGuy (Jan 9, 2021)

Dangit, I started this thread and plum forgot about it!   As some may be aware, I've replaced my Sapphire Nitro+ RX 6800 with a Nitro+ RX 6900 XT. I'd have been just as happy with a 6800 XT, but sadly I can't get my mitts on one, and a locally distributed RX 6800 XT is priced not far off an RX 6900 XT from Amazon USA. I paid about 1335 USD for my Nitro+ 6900 XT; prices are just nutz here in my neck of the woods. Anyway, on to the pics...





A comparison of the two cards; don't judge by the angle of the pic, both look very similar and are the same size from what I could see of them... the RX 6800 is in the foreground.





Installed in my rig and raring to go. Man, I love gaming with this card! Gears of War 5: Hivebusters runs silky smooth at max settings @ 3840x1080, and Serious Sam 4 runs just as smooth.....





I'm no benchmarker, nor am I an OC'er, so my benchmark scores are more or less stock, with my CPU doing its auto-OC. I may have undervolted my GPU in one or two tests though.









Will post my 3DMark Timespy and Firestrike Extreme scores a bit later....


----------



## GamerGuy (Feb 2, 2021)

C'mon, guys, surely there're more than two RX 6000 series owners here.....


----------



## wolf (Feb 2, 2021)

@lynx29 has a RX6800 I believe? Might not know about the club.

It's depressing how long these shortages have kept up, it should be such an awesome time for high end GFX, but no matter which flavour you want, they're bloody hard to buy, and if you can they're more often massively price inflated


----------



## GamerGuy (Feb 2, 2021)

wolf said:


> It's depressing how long these shortages have kept up, it should be such an awesome time for high end GFX, but no matter which flavour you want, they're bloody hard to buy, and if you can they're more often massively price inflated


QFT! 

I'd had to 'settle' for the RX 6900 XT simply because the more popular AIB RX 6800 XT's (from XFX, PowerColor, Sapphire) are nowhere to be found. Looking forward to more joining the club!


----------



## Space Lynx (Feb 2, 2021)

wolf said:


> @lynx29 has a RX6800 I believe? Might not know about the club.
> 
> It's depressing how long these shortages have kept up, it should be such an awesome time for high end GFX, but no matter which flavour you want, they're bloody hard to buy, and if you can they're more often massively price inflated



yep it is, I am super lucky I got a 5600X and an RX 6800 non-XT on launch day... honestly still can't believe it lol. I think it will be this way (sold out) until late summer. My OC on my non-XT matches a stock XT across the board and is fully stable, so yeah, I'm pretty happy considering I paid MSRP.

@GamerGuy Don't feel like installing ShareX at the moment, but my 3DMark scores as of Jan 23rd, with beta drivers and the latest mobo BIOS, are a 16,021 graphics score in Time Spy and 9,043 in the Time Spy CPU test. Pretty happy with all of it. Hoping this will be a 3-5 year, possibly longer, build for me as I have a ton of backlog. Also, I use a strong fan curve, so I never break a 65°C average temp even with the OC; the hotspot sometimes hits 79°C but it's rare and doesn't last long.


----------



## turbogear (Feb 2, 2021)

GamerGuy said:


> C'mon, guys, surely there're more than two RX 6000 series owners here.....


It seems that not many owners are interested in sharing their experiences here.
There is more activity at the Guru3D 6800 owners' thread, where I am also a member, than here.

In any case, I am still tuning my 6800XT for more performance.
With the latest driver 21.2.1, I am getting a ~19350 score in Time Spy at 2650MHz @ 970mV with a 15% power limit.
I am using MPT with the GPU Power Limit set to 275W, so the total power the GPU core can draw is ~316W once the 15% power limit in Wattman is taken into account.
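The slider arithmetic is simple enough to sketch; this is just the percentage math, nothing card-specific, and the function name is my own:

```python
# Max GPU core power = MPT base limit scaled by the Wattman power-limit slider.
# core_power_budget() is an illustrative helper, not part of any tool.

def core_power_budget(mpt_limit_w, wattman_slider_pct):
    """Return the core power budget in watts for a given slider percentage."""
    return mpt_limit_w * (1 + wattman_slider_pct / 100)

print(round(core_power_budget(275, 15), 2))  # 316.25
```

Note that this only covers the GPU core; the memory and board components draw extra power on top of this budget.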

In Assassin's Creed Valhalla, I get a GPU frequency in the range of 2580-2612MHz, and the power draw for the GPU core alone is 220-250W.


----------



## ratirt (Feb 4, 2021)

Joining the club with my 'little' thing. It is a hell of a card, worthy of the name 'Red Devil', that's for sure.
I'll add more photos later on.



Adding some more stuff
The quality is not the best.




Test results. 2nd run to make it more accurate







Unigine test


----------



## Space Lynx (Feb 4, 2021)

that powercolor card cost a lot LOT more than my non-xt stock model, uses more power, and only gets 500 more points in time spy in graphics score. hmm. I don't get it. i mean i get that i have a mild OC on my card and that is your stock score, but still its a 6900 xt vs a 6800 non-xt... the difference shouldn't be that close.


----------



## BMfan80 (Feb 4, 2021)

I'll add mine to this thread since there isn't much going on here.
I bought myself an XFX 6800XT Merc in early January, my first AMD card since my Sapphire Tri-X R9 290 OC.
It has one of the best coolers I've ever had on a GPU. I have a custom fan profile set and I've never seen the card hit 60 degrees; at 50 degrees it is set to only 30% fan speed, and we are in the middle of summer here.
It reaches over 30°C most days and I don't have air con; my 1080 Ti was quite often over 60% fan speed.
Just ran a quick benchmark.


----------



## ratirt (Feb 4, 2021)

lynx29 said:


> that powercolor card cost a lot LOT more than my non-xt stock model, uses more power, and only gets 500 more points in time spy in graphics score. hmm. I don't get it. i mean i get that i have a mild OC on my card and that is your stock score, but still its a 6900 xt vs a 6800 non-xt... the difference shouldn't be that close.


Well, I still have some tweaking to do. Besides, my CPU is not the best one, though for games at 4K it is quite capable.


----------



## Space Lynx (Feb 4, 2021)

ratirt said:


> Well, I still have some tweaking to do. Besides, my CPU is not the best one, though for games at 4K it is quite capable.




oof just noticed you're on a 2700X, yikes. time to upgrade indeed. if you have a 6900 XT and a 2700X, you're doing it wrong mate. lol I understand supply issues though, so yeah, take what you can get at the moment.


----------



## Athlonite (Feb 4, 2021)

I think I can join now that I have a Sapphire Nitro+ RX6800 OC 16GB card 


Sapphire Pulse RX5700 on top vs Sapphire Nitro+ RX6800 on the bottom



Had to remove a fan to fit it in; luckily the fan will fit back in under the GPU.


----------



## GamerGuy (Feb 5, 2021)

lynx29 said:


> oof just noticed you're on a 2700X, yikes. time to upgrade indeed. if you have a 6900 XT and a 2700X, you're doing it wrong mate. lol I understand supply issues though, so yeah, take what you can get at the moment.


Yep, I was gonna point that out but it seems it has been clarified. And I agree with you, an awesome 6900XT + 2700X is just wrong.....

@Athlonite  NICE! I had a Nitro+ RX 6800 too, awesome card, together with your CPU, ought to last you quite a while. Like mine, I was toying with the idea of going 5950X/5900X but my present 3900X should be good enough for quite a while too.

Does this ole heart of mine good to see more guys joining in!


----------



## ratirt (Feb 5, 2021)

GamerGuy said:


> Yep, I was gonna point that out but it seems it has been clarified. And I agree with you, an awesome 6900XT + 2700X is just wrong.....
> 
> @Athlonite  NICE! I had a Nitro+ RX 6800 too, awesome card, together with your CPU, ought to last you quite a while. Like mine, I was toying with the idea of going 5950X/5900X but my present 3900X should be good enough for quite a while too.
> 
> Does this ole heart of mine good to see more guys joining in!


I know that, guys, but you know, everything in turn. Next up is a new monitor, and then we can talk processors. I can squeeze more performance from that 2700X and I will probably try at some point and re-benchmark again.
I had a 6800XT Nitro card in my hands, and boy oh boy, it was a doozy of a card. These cards are really amazing. Besides, whenever I look at reviews, all the reviewers say the reference cooler is just amazing. The AIBs had a hard time making their cards actually better than the reference ones.

BTW.
Did any of you try undervolting these things and how does that correspond to frequency and power? How far were you able to undervolt?


----------



## Nuckles56 (Feb 5, 2021)

I'm currently trying to pick up a 6800 to replace my 1080ti, as it would be a nice step up for me and much better for CP 2077 @4k


----------



## kapone32 (Feb 5, 2021)

I have to update my specs but I am now the proud owner of a 6800XT MSI Gaming X Trio.


----------



## GamerGuy (Feb 5, 2021)

ratirt said:


> I know that, guys, but you know, everything in turn. Next up is a new monitor, and then we can talk processors. I can squeeze more performance from that 2700X and I will probably try at some point and re-benchmark again.
> I had a 6800XT Nitro card in my hands, and boy oh boy, it was a doozy of a card. These cards are really amazing. Besides, whenever I look at reviews, all the reviewers say the reference cooler is just amazing. The AIBs had a hard time making their cards actually better than the reference ones.


Just ribbing you, no malice or offense meant. I love the Red Devil; I have a PC RX VEGA64 Red Devil and would have loved an RX 6900 XT version of it, but the Sapphire Nitro+ RX 6900 XT sorta fell into my lap (and there was no sign of the Red Devil around). 'Grats on scoring one!


Nuckles56 said:


> I'm currently trying to pick up a 6800 to replace my 1080ti, as it would be a nice step up for me and much better for CP 2077 @4k





kapone32 said:


> I have to update my specs but I am now the proud owner of a 6800XT MSI Gaming X Trio.


Nice! Don't forget to update your system specs in your profile!


----------



## ratirt (Feb 5, 2021)

GamerGuy said:


> Just ribbing you, no malice or offense meant. I love the Red Devil; I have a PC RX VEGA64 Red Devil and would have loved an RX 6900 XT version of it, but the Sapphire Nitro+ RX 6900 XT sorta fell into my lap (and there was no sign of the Red Devil around). 'Grats on scoring one!


No offense taken.
Hehe, got a Red Devil V64 too. I could have gotten the Nitro+ 6800 XT, but I wanted a Red Devil and I ended up with the 6900 XT.


----------



## kapone32 (Feb 5, 2021)

kapone32 said:


>


It's not two cards, just one. I thought my Sapphire Nitro Vega 64 was a big card; the Nitro was 12" long and so is this one, but the width is insane. I have to see how it performs with my 5600X system, but that will have to wait as the card can't fit in the case that has my 5600X lol.


----------



## kapone32 (Feb 8, 2021)

Well, the weekend has passed and I am going to share my experience with the 6800XT. As I stated before, I had the card installed in my X399 build with a 2920X that boosts to 4.3 GHz on the best cores. Using that system I got a Firestrike score of 25814, and the TWWH2 battle benchmark gave me 92.4 FPS. I ran a few others and they were all about 10 to 15% faster than my Vega 64.

On Saturday my Antec A400 arrived and it took about 45 minutes to change everything (MSI X570 Pro >> MSI X570 Unify, RX 5700 >> 6800XT, Nepton 280 >> A400, Gigabyte 650B >> XPG Gold 750). When I ran the same benchmarks with my 5600X @ 4.7 GHz, TWWH2 gave me a 154 FPS average, but Firestrike was a ridiculous 35133, almost 10000 points higher.

When it comes to raw gaming, TWWH2 is still a resource hog, so I will say the lowest we got was 47 FPS at any point in time playing a 180-turn campaign. Older games like Grim Dawn run at 170 FPS with no deviation. The clock would run anywhere from 1800 to a more consistent 2300 MHz.

The next thing I will test is Resizable BAR enabled vs. disabled. It would seem that even that gives a discernible improvement, but I don't have enough data to state that yet.

What I can say is that the 6000 series is up to 85% faster than the 5000 series. You will not fully realize that, though, unless you pair it with a 5000 series CPU. As the 5600X is the lowest in that stack, I can see that your experience should be at least the same if you have a 5800X and up.

The caveat is the size of the card. As far as I am concerned, a triple-slot card is way too big.


----------



## sepheronx (Feb 8, 2021)

What resolution did you test the game at? It would have been cool if you had tested that GPU at multiple resolutions on both systems to see where the CPU difference becomes negligible.


----------



## GerKNG (Feb 8, 2021)

i totally underestimated the size of the 6900XT merc 319...

but who needs DLSS when you have 2.7 GHz? (I get more FPS in games like Black Ops: Cold War at native 1440p than with my 3080 at 450W with DLSS in Quality Mode)

only the artifacts in screen space reflections are still a thing that needs to be fixed.


----------



## kapone32 (Feb 8, 2021)

sepheronx said:


> What resolution did you test the game at? It would have been cool if you had tested that GPU at multiple resolutions on both systems to see where the CPU difference becomes negligible.


Both computers were hooked up to the Gigabyte 32QC at 1440p; that's a 165Hz VA panel. My data seems to suggest that the CPU does matter, as the 2920X did not behave any differently in Firestrike vs. past tests (and Firestrike doesn't seem to care about resolution).



GerKNG said:


> i totally underestimated the size of the 6900XT merc 319...
> 
> but who needs DLSS when you have 2.7 Ghz?  (i have more FPS in games like Black Ops: Cold War at native 1440p than with my 3080 at 450W with DLSS in Quality Mode)
> 
> only the artifacts in screen space reflections are still a thing that needs to be fixed.


The only complaint I have is the high memory clock at idle when hooked up to a high-refresh monitor. My 6800XT's memory runs at 1986 MHz all day long attached to my 32QC.


----------



## GerKNG (Feb 8, 2021)

kapone32 said:


> Both computers were hooked up to the Gigabyte 32QC at 1440p; that's a 165Hz VA panel. My data seems to suggest that the CPU does matter, as the 2920X did not behave any differently in Firestrike vs. past tests (and Firestrike doesn't seem to care about resolution).
> 
> 
> The only complaint I have is the high memory clock at idle when hooked up to a high-refresh monitor. My 6800XT's memory runs at 1986 MHz all day long attached to my 32QC.


yep that's true. constantly sitting at its max clock speed.


----------



## kapone32 (Feb 8, 2021)

GerKNG said:


> yep that's true. constantly sitting at its max clock speed.


The thing for me though is that even then the cooler keeps the card in the 30s which is much better than the 5000 series.


----------



## GerKNG (Feb 8, 2021)

kapone32 said:


> The thing for me though is that even then the cooler keeps the card in the 30s which is much better than the 5000 series.


but that's my real problem.


----------



## kapone32 (Feb 8, 2021)

GerKNG said:


> but that's my real problem.


Are you using a Freesync monitor.


----------



## GerKNG (Feb 8, 2021)

kapone32 said:


> Are you using a Freesync monitor.


Yes.


----------



## kapone32 (Feb 8, 2021)

GerKNG said:


> Yes.


I guess the 6900XT is pushing more Frames than your monitor can handle and must be outside of the Freesync range. That sounds like a nice problem to have though.


----------



## GerKNG (Feb 8, 2021)

kapone32 said:


> I guess the 6900XT is pushing more Frames than your monitor can handle and must be outside of the Freesync range. That sounds like a nice problem to have though.


that has nothing to do with the artifacting.


----------



## kapone32 (Feb 8, 2021)

GerKNG said:


> that has nothing to do with the artifacting.


What do you think it is?


----------



## ratirt (Feb 8, 2021)

Guys, have you noticed that CP2077 has the FidelityFX option? I tried using it (not extensively) and it does boost the FPS depending on which mode you use, Static or Dynamic. Do you guys see any difference in FPS?
With my 6900XT I've noticed more FPS when I set Dynamic to 80-100: 15-20 FPS more. Maybe I need to do some more testing, but it would seem it works. I'm not sure about the image quality; it was just to check whether I would get more FPS.
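For anyone curious what a Dynamic 80-to-100 setting is doing conceptually: the renderer lowers the internal render scale when it misses its frame-time budget and raises it again when there is headroom. A generic sketch of that idea follows; it is an illustration only, not CDPR's actual implementation, and `next_render_scale` is a made-up helper:

```python
# Generic dynamic-resolution controller: nudge the render scale down when a
# frame goes over the time budget, up when there is headroom, clamped to a
# user-chosen range (e.g. "Dynamic 80 to 100" -> lo=0.80, hi=1.00).

def next_render_scale(scale, frame_ms, target_ms, lo=0.80, hi=1.00, step=0.02):
    """Return the render scale to use for the next frame."""
    if frame_ms > target_ms:
        scale -= step   # over budget: render fewer pixels
    else:
        scale += step   # under budget: claw quality back
    return max(lo, min(hi, scale))

scale = 1.00
for frame_ms in [18.0, 18.5, 17.2, 15.9, 15.0]:  # 16.7ms budget = 60 FPS
    scale = next_render_scale(scale, frame_ms, target_ms=16.7)
print(round(scale, 2))  # 0.98
```

The FPS gain you see comes from the frames rendered below native resolution; the image-quality cost depends on how often the controller sits near the lower bound.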


----------



## kapone32 (Feb 8, 2021)

ratirt said:


> Guys, have you noticed that CP2077 has the FidelityFX option? I tried using it (not extensively) and it does boost the FPS depending on which mode you use, Static or Dynamic. Do you guys see any difference in FPS?
> With my 6900XT I've noticed more FPS when I set Dynamic to 80-100: 15-20 FPS more. Maybe I need to do some more testing, but it would seem it works. I'm not sure about the image quality; it was just to check whether I would get more FPS.


For some reason I cannot even enable Ray Tracing in CP2077.


----------



## sepheronx (Feb 8, 2021)

kapone32 said:


> Both computers were hooked up to the Gigabyte 32QC at 1440p; that's a 165Hz VA panel. My data seems to suggest that the CPU does matter, as the 2920X did not behave any differently in Firestrike vs. past tests (and Firestrike doesn't seem to care about resolution).
> 
> 
> The only complaint I have is the high memory clock at idle when hooked up to a high-refresh monitor. My 6800XT's memory runs at 1986 MHz all day long attached to my 32QC.


Hence why I was asking about the game you mentioned. Unless I'm wrong in what you tested.

Anyway, from gaming perspective, I've seen where at 1440p and 4k the 5600x wasn't much further ahead than an older 1650 v3 let's say.  I watch myconst review on this stuff.

I'm glad your system is up and running and getting impressive results. I'm rather jealous of your Rx GPU as I tried to buy one but cheapest one was $900 cad (6800 non XT) while $800 after taxes for a 3070.


----------



## kapone32 (Feb 8, 2021)

sepheronx said:


> Hence why I was asking about the game you mentioned. Unless I'm wrong in what you tested.
> 
> Anyway, from gaming perspective, I've seen where at 1440p and 4k the 5600x wasn't much further ahead than an older 1650 v3 let's say.  I watch myconst review on this stuff.
> 
> I'm glad your system is up and running and getting impressive results. I'm rather jealous of your Rx GPU as I tried to buy one but cheapest one was $900 cad (6800 non XT) while $800 after taxes for a 3070.


Totally understood. The rasterization performance of the 6800XT was what sold me. Then Canada Computers was only $30 over MSRP for the MSI card. It was $1149 (CAD) before tax.


----------



## ratirt (Feb 8, 2021)

kapone32 said:


> For some reason I cannot even enable Ray Tracing in CP2077.


I'm talking about FidelityFX not ray tracing.


----------



## Deleted member 202104 (Feb 8, 2021)

kapone32 said:


> For some reason I cannot even enable Ray Tracing in CP2077.



RT in CP2077 is only for Nvidia at this point.  CDPR mentioned AMD support would be enabled eventually.

Oh, and I should probably add my name to the club.  Picked up a Gigabyte branded 6800 reference card a while back.


----------



## kapone32 (Feb 8, 2021)

ratirt said:


> I'm talking about FidelityFX not ray tracing.


Yeah, I was just adding to that. I tried it when I saw it in the menu (don't tell work) while doing my next story mission; I couldn't really notice a difference, but I guess it felt smoother.


----------



## Athlonite (Feb 8, 2021)

Yup, FidelityFX works quite well if you don't use silly settings like 50%/100%. I've had mine at 90%/100% with both my RX 5700 and the RX 6800 and it seems to work quite well, especially in intense fight scenes; as I'm too busy shootin' gonks I don't really notice any reduction in scene quality. Out and about driving around you can get the odd bit of blurry textures, not a lot mind, just every now and then, so I'll leave it set to what I have and just enjoy playing.


----------



## puma99dk| (Feb 9, 2021)

My MSI Radeon RX 6800 XT Gaming X Trio can be seen in the "What's your latest tech purchase?" thread on www.techpowerup.com.


----------



## turbogear (Feb 9, 2021)

GerKNG said:


> i totally underestimated the size of the 6900XT merc 319...
> 
> but who needs DLSS when you have 2.7 Ghz?  (i have more FPS in games like Black Ops: Cold War at native 1440p than with my 3080 at 450W with DLSS in Quality Mode)
> 
> only the artifacts in screen space reflections are still a thing that needs to be fixed.


You are very lucky that your 6900XT reaches 2.7GHz.
As far as I have read in forums, there are not many 6900XTs that can achieve that.

My 6800XT can stably reach 2580-2612MHz in games with a 15% power limit.
I can achieve up to 2680MHz with MPT, but that is not stable in all games.


----------



## ratirt (Feb 9, 2021)

Athlonite said:


> Yup, FidelityFX works quite well if you don't use silly settings like 50%/100%. I've had mine at 90%/100% with both my RX 5700 and the RX 6800 and it seems to work quite well, especially in intense fight scenes; as I'm too busy shootin' gonks I don't really notice any reduction in scene quality. Out and about driving around you can get the odd bit of blurry textures, not a lot mind, just every now and then, so I'll leave it set to what I have and just enjoy playing.


I've noticed the same thing. 4k maxed out and I got 60FPS with the fidelity. I need to get some time and maybe take a closer look at this.


----------



## turbogear (Feb 9, 2021)

kapone32 said:


> The only complaint I have is the high memory clock at idle when hooked up to a high-refresh monitor. My 6800XT's memory runs at 1986 MHz all day long attached to my 32QC.


It is the same situation for me with my ASUS ROG Strix XG27WQ, which is also a 165Hz monitor.
If I change the monitor refresh rate to 144Hz then the memory clock drops; otherwise it runs in the same range as you mentioned.


----------



## ratirt (Feb 9, 2021)

turbogear said:


> It is the same situation for me with my ASUS ROG Strix XG27WQ which is also 165Hz monitor.
> If I change monitor refresh rate to 144Hz then memory clock drops otherwise it runs in the same range as you mentioned.


Oh damn. Really? I haven't noticed that. I hope this is something that can be fixed by some sort of a driver tweak.


----------



## kapone32 (Feb 9, 2021)

ratirt said:


> Oh damn. Really? I haven't noticed that. I hope this is something that can be fixed by some sort of a driver tweak.


It is something that has been affecting RDNA since the 5700XT. I don't see it going away anytime soon.


----------



## turbogear (Feb 9, 2021)

ratirt said:


> Oh damn. Really? I haven't noticed that. I hope this is something that can be fixed by some sort of a driver tweak.


I am not sure if they will/can fix this or not.
I was told by some advanced users at Guru3D that the problem is that the VRAM may not be able to support intermediate frequency steps between max and min.
For these high refresh rates the low-frequency state of the VRAM may not be enough. An intermediate frequency would be needed, and I am not sure whether that could be done with GDDR6.
HBM had that possibility in previous generations of AMD GPUs.

Let's hope they find a solution.
It would be good if we keep reporting this bug through the Radeon Software. I reported it already in January; I will report it again for 21.2.1.









RDNA2 RX6000 Series Owners Thread, Tests, Mods, BIOS & Tweaks! (forums.guru3d.com)


----------



## Athlonite (Feb 9, 2021)

Yeah, I don't think it's just a high-refresh-monitor problem, as I was seeing the same thing on my RX 5700 and a 24" 75Hz ViewSonic VA panel monitor; it's not doing it with the RX 6800 though.
Even after AMD said they'd fixed the issue I still had it happen on the RX 5700. I still think it's a driver problem, not a GPU/monitor problem.


----------



## kapone32 (Feb 9, 2021)

Colour me impressed! A 186 FPS average in Division 2, and before that, 204 FPS playing Squadrons in 4K. I can officially state that this card is faster than two Vega 64s in Crossfire-supported games like TWWH2, AOTS, Jedi: Fallen Order, Division 2 and Squadrons.


----------



## 68Olds (Feb 10, 2021)

I'm finally in the club!  A Powercolor Red Dragon came today.


----------



## ratirt (Feb 10, 2021)

68Olds said:


> I'm finally in the club!  A Powercolor Red Dragon came today.
> 
> View attachment 187754


Welcome to the club, younger sibling. 
Not sure how different the Red Dragon is compared to the Red Devil. Probably the shroud is made with fewer metal parts, but I wouldn't know for sure.


----------



## Athlonite (Feb 10, 2021)

ratirt said:


> I've noticed the same thing. 4k maxed out and I got 60FPS with the fidelity. I need to get some time and maybe take a closer look at this.


I wouldn't go lower than 80~85% as it starts to look quite nasty lower than that


----------



## PlayToBeat (Feb 10, 2021)

If I increase the power limit by 15%, my temps go as high as 100°C (junction) and the GPU sometimes hits 350W.
If I don't increase the power limit at all, my temps are around 90°C (junction) and the GPU sometimes hits 300W.



Any reason why I should increase it?


----------



## ratirt (Feb 10, 2021)

PlayToBeat said:


> If I increase the power limit to 15%, my temps go as high as 100C (junction) and the GPU hits 350W sometimes.
> If I don't increase the power limit at all, my temps around 90C (junction) and GPU hits 300W sometimes.
> View attachment 187778
> Any reason why should I increase?


You have a 6800XT, as I see.
I don't see your junction temp reaching 100°C in your screenshot, but your manual fan curve is set to only 60% even when the temp reaches 100°C.
Maybe you should use temperature monitoring to see what the max junction temp actually is when you OC, and then adjust the fan curve, or set it to automatic just to get some default-settings data.


----------



## PlayToBeat (Feb 10, 2021)

ratirt said:


> you have 6800XT as I see.
> I don't see your Jtemp go 100. Your manual fan curve is set to 60% when the temp does reach 100c.
> Maybe you should use temp monitoring to see what the max jtemp actually is when you OC and then adjust fan curve or set it to automatic just to get some default settings data.


It hits 100°C when I use the 3DMark benchmark.
I didn't even notice the 60%; I need to change that, of course. Thanks.
About the power limit, any cons?


----------



## ratirt (Feb 10, 2021)

PlayToBeat said:


> It hits 100C when I use 3DMARK benchmark.
> I didn't even notice the 60% - need to change that of course. Thanks.
> About the power limit, any cons?


Well, it is obvious that when you use a higher power target the temp will go up. If you OC at the same time, the temp will also rise accordingly. The junction temp is a hot-spot temp; it can supposedly hit 110°C before the GPU throttles. Of course you want to keep it as cool as possible, but that's not odd behavior.
3DMark is a benchmark, so it stresses the card to its limits; no wonder you get these kinds of readings. What card brand have you got?

I think I got it wrong earlier: the fan curve is set to 80% at 100°C. It is something to fiddle with anyway.
BTW, you can also try increasing the power limit to maximum while lowering the vcore, 10mV at a time. But you will have to test for stability.


----------



## PlayToBeat (Feb 10, 2021)

ratirt said:


> Well, it is obvious that when you use higher power target the temp will go up. If you OC at the same time the temp will also go up respectively. The Junction temp is a hot spot temp. So it is supposedly be hitting 110c before the GPU will throttle. Of course you want to keep it as cool as possible but that's not an odd behavior.
> 3d Mark is a benchmark so it stresses the card to its limits. No wonder you get this type of readings. What card brand you've got?
> 
> I think I got it wrong. The fan curve is set to 80% while at 100c. It is something to fiddle with anyway.
> BTW. You can also try increasing power limit to maximum but also lowering vcore trying by 10 at a time. But you will have to test for stability.


RX 6800XT Nitro+
This is the lowest vcore I could get.
I suspect the airflow in the case may be bad. I'll do some tests.


----------



## ratirt (Feb 10, 2021)

PlayToBeat said:


> RX 6800XT Nitro+
> this is the lowest vcore I could get
> I suspect maybe the airflow in the case is bad. I'll do some test.


Well, if you don't have sufficient airflow in your case, that is something to look at.
When I get home I will run 3DMark a few times in a row on my 6900 XT Red Devil and see what temps I get. I haven't had a lot of time to look at it yet.


----------



## PlayToBeat (Feb 10, 2021)

ratirt said:


> Well, if you don't have a sufficient airflow in your case that is something to look at.
> When I get home I will 3dmark few times in a row my 6900 xt red devil and see what temps I get. Didn't have a lot of time to look at it yet.


Alright, waiting for your results.


----------



## Kirsebaer (Feb 10, 2021)

Was getting WHEA errors for a month and was troubleshooting BIOS versions, RAM settings etc....turned out to be hwinfo -_- The new beta of hwinfo has fixed it. Just an FYI.


----------



## Felix123BU (Feb 10, 2021)

Have a 6800 XT ref, will post pics and info when my water loop arrives, plan to switch the whole system to loopy loop water


----------



## ratirt (Feb 10, 2021)

PlayToBeat said:


> Alright, waiting for your results.


Here is my HWiNFO. The hot spot is the junction temp, and it hasn't gone above 89°C.


----------



## turbogear (Feb 10, 2021)

PlayToBeat said:


> If I increase the power limit to 15%, my temps go as high as 100C (junction) and the GPU sometimes hits 350W.
> If I don't increase the power limit at all, my temps are around 90C (junction) and the GPU sometimes hits 300W.
> View attachment 187778
> Any reason why I should increase it?


For the Nitro+ 6800XT, the GPU power limit is set to 290W by default, compared to 255W on the reference design card.
This means that when you apply the 15% power limit, you allow ~333.5W to the GPU core. Add to that around 45W for the other components on the board, and you have a total of around 378W.
This is close to the 375W that 2×8-pin connectors and the PCIe slot can supply in combination. You are driving your card at its limits.
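A quick sketch of that arithmetic (the ~45W budget for non-core board components is an estimate, not a measured value):

```python
# Rough board-power estimate from the Wattman core power limit.
# The 45 W "everything else" figure (VRAM, VRM losses, fans) is a guess.
def board_power(core_limit_w, slider_pct, other_w=45):
    core = core_limit_w * (1 + slider_pct / 100)
    return core, core + other_w

core, total = board_power(290, 15)  # Nitro+ 6800 XT at +15%
print(f"core ~ {core:.1f} W, total ~ {total:.1f} W vs. the 375 W connector ceiling")
```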

Maybe it needs further tuning. I thought the Nitro+ could achieve better results than reference cards with respect to overclocking and undervolting.

Good air circulation should help with the temperatures.
I have a Cooler Master Cosmos C700P, which provides a lot of airflow with 2 intake fans at the bottom and 6 fans on radiators pushing air out.
Before I put my 6800XT under water, my temperatures were 67°C edge and 90°C junction max under stress tests. At that time I had modified the fan curve to ramp to 100% fan speed relatively fast.

With the water-cooled setup, the 6800XT maxes out at 47°C with a junction of 67°C in stress tests; the junction temperature in games is around 54°C. 

I have the reference XFX 6800XT, and I use 275W for the GPU core and 305A TDC applied via MPT, plus a 15% power limit in Wattman.
This means the GPU core is allowed to draw 316.25W, and total GPU power is around 361W.
I am able to achieve 2580-2612MHz clock in games like Assassin's Creed Valhalla.

My settings in Wattman are 2650MHz @ 970mV.
These are my settings for daily gaming, not just for the highest score in a short benchmark. 
I achieve a score of 19376 in Time Spy with this.

So far it has been very stable in all the games I've played recently (Borderlands 3, Assassin's Creed Valhalla, Cyberpunk). 


Mine does not like a VRAM overclock, though. Even a 50MHz increase on MCLK causes crashes in benchmarks. Maybe it has to do with SAM; I have not tried it without SAM.


----------



## 68Olds (Feb 10, 2021)

ratirt said:


> Welcome to the club younger sibling
> Not sure how much different the Red Dragon is in comparison to the Red Devil. Probably the shroud is made with fewer metal parts, but I wouldn't know for sure.


The Dragon has a lower factory overclock than the Devil.  Not sure that makes a difference in maximum clocks, provided I can keep it cool enough.  The Devil might allow a higher power limit too, but I haven't gotten into any overclocking yet.


----------



## GamerGuy (Feb 11, 2021)

Yay! More and more are joining the club!!! 

As for refresh rate and memory clock, does this happen with a Freesync monitor at 144Hz? I ask because I have a Freesync 2 monitor at 144Hz, and my GPU MCLK stays between 24MHz and 196MHz on the desktop; I haven't seen it go higher than that.


----------



## Deleted member 202104 (Feb 11, 2021)

GamerGuy said:


> Yay! More and more are joining the club!!!
> 
> As for refresh rate and memory clock, does this happen with Freesync monitor at 144Hz? I ask because I have a Freesync 2 monitor at 144Hz, and my GPU MCLK stays at between 24Mhz to 196Mhz on desktop, haven't seen it go higher than the latter.



On the desktop with my FreeSync Premium Pro monitor at 60Hz and 120Hz, my MCLK is similar to yours; at 165Hz it goes to full speed.  Power also goes from 9W to 34W.

It's the same behavior as my last card (5700XT).


----------



## GamerGuy (Feb 11, 2021)

weekendgeek said:


> On the desktop with my FreeSync Premium Pro at 60Hz and 120Hz my MCLK is similar to yours, at 165Hz it goes full speed.  Power also goes from 9w to 34w.
> 
> It's the same behavior as my last card (5700XT).


So, I guess it means that at 144Hz (Freesync 2 monitor) it acts as it should, but going up a notch to 165Hz causes the MCLK to go nuts, right? If that's the case, I'm so glad I stuck with a 144Hz refresh range when I upgraded to my present monitor. But I guess this glitch will probably be resolved in a future driver update...


----------



## Space Lynx (Feb 11, 2021)

GamerGuy said:


> So, I guess it means that at 144Hz (Freesync 2 monitor) it acts as it should, but going up a notch to 165Hz causes the MCLK to go nutz, right? If that's the case, so glad I'd stuck with 144Hz refresh range when I'd upgraded my monitor to my present one. But, I guess this glitch would prolly be resolved with a future driver update....



It's really not that big of a deal, though. It's not taxing the memory hard when idle, even though it's reporting max speed. It's no different than my 5600X all-core OC at 4.625GHz: HWiNFO shows it at 4.625GHz 24/7, but it's not actually taxed until I place a load on it. So this doesn't bother me too much, honestly.

I'm really impressed with my 5600X and 6800 non-XT. I honestly expected some issues because of the 5700 XT horror stories (I skipped that generation), but everything has been rock solid for me when not OC'ing. And now that I have OC'ing somewhat mastered, I've had zero issues for a solid month with everything OC'd, including the monitor.


----------



## turbogear (Feb 11, 2021)

weekendgeek said:


> On the desktop with my FreeSync Premium Pro at 60Hz and 120Hz my MCLK is similar to yours, at 165Hz it goes full speed.  Power also goes from 9w to 34w.
> 
> It's the same behavior as my last card (5700XT).


Yes, I have exactly the same situation with my FreeSync Premium Pro monitor.
Up to 144Hz, MCLK stays low, but setting the monitor to 165Hz causes it to run at full speed.
For games like Assassin's Creed Valhalla and Cyberpunk, even 144Hz is enough; the FPS does not reach that limit even with my heavily OCed 6800XT.  
In Borderlands 3, if I set the refresh to 144Hz, the FPS often sat constantly at it.


----------



## ratirt (Feb 11, 2021)

turbogear said:


> Yes I have exactly the same situation with my FreeSync Premium Pro monitor.
> Upto 144Hz, MCLK is low but setting monitor to 165Hz causes it to go full speed.
> For games like Assassin's creed valhalla and Cyberpunk even 144Hz is enough. The FPS does not reach the limit also with my heavily OCed 6800XT.
> In Borderlands 3 if I set refresh to 144Hz, the FPS was often sitting constantly there.


I really need to find some time to look at this stuff. I haven't noticed anything so far, but I keep forgetting. I'm getting a new monitor soon (damn, hope it arrives) and I need to double-check that with FreeSync Premium.
BTW, I tried CP2077 yesterday and damn, at some point I got 20FPS; 60FPS is spot on hehe. That game is crazily demanding.


----------



## turbogear (Feb 11, 2021)

ratirt said:


> I really need to get some time to look at this stuff. I havent noticed anything so far but I keep forgetting. I'm getting a new monitor soon (damn hope it will arrive) and I need to double check that with FreeSync Premium.
> BTW. I tried CP2077 yesterday and damn, at some point I got 20FPS 60FPS is spot on hehe. That game is crazily demanding.


Did you try undervolting and OCing your Red Devil 6900XT?
It would be interesting to see if they have higher-binned ASICs that overclock better.


----------



## ratirt (Feb 11, 2021)

turbogear said:


> Did you try undervolting and OCing your Red Devil 6900XT?
> It would be interesting to see if they have higher binned ASIC to overclock better.


I didn't try much yet; I've got no time. Just simple benchmarks etc. 
The problem with all the 6000 series cards is the power limit: it is locked, so you can't go crazy with it, but I plan to check, maybe over the weekend. I need a good benchmark that leaves the CPU alone and just maxes the card out. Maybe some benchmarks at 4K; my 2700X at 1080p can't keep up with the 6900XT.


----------



## Space Lynx (Feb 11, 2021)

ratirt said:


> That game is crazily demanding.



or badly coded.


----------



## turbogear (Feb 11, 2021)

ratirt said:


> I didn't try much yet. I've got no time. Just simple benchmarks etc.
> The problem with all 6000 series cards is the power limit. it is locked and thus you can't go crazy with it but I plan to check. Maybe over the weekend. I need to get a good benchmark that would leave the cpu alone and just get the card maxed out. Maybe some benchmarks at 4k. My 2700x at 1080p can't keep up with the 6900xt.


In theory your card can draw 525W with its 3×8-pin connectors, compared to mine, where the limit is 375W because of only 2×8-pin. But if it is locked and not allowed to go that high, then it is a pity. 

The 3×8-pin connectors are then kind of useless until somebody mods the BIOS.
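For reference, those ceilings come from the standard power-delivery budgets: 75W from the PCIe slot plus 150W per 8-pin connector. A tiny sketch:

```python
# Theoretical board power ceiling: 75 W from the PCIe x16 slot
# plus 150 W per 8-pin PEG connector.
def power_ceiling(eight_pin_count):
    return 75 + 150 * eight_pin_count

print(power_ceiling(2))  # reference 6800 XT / 6900 XT: 375
print(power_ceiling(3))  # e.g. a 3x8-pin Red Devil: 525
```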


----------



## ratirt (Feb 11, 2021)

turbogear said:


> In theory your card is able to deliver 525W with 3×8pin connectors compared to mine where the limit is 375W because of only 2x8pin, but if it is locked and not allowed to go that high then it is a pity.
> 
> The 3x8pin connectors are then kind of useless until somebody mods the BIOS.


I will have to try it out at some point, but I'm worried my PSU is not enough.


----------



## kapone32 (Feb 11, 2021)

ratirt said:


> I didn't try much yet. I've got no time. Just simple benchmarks etc.
> The problem with all 6000 series cards is the power limit. it is locked and thus you can't go crazy with it but I plan to check. Maybe over the weekend. I need to get a good benchmark that would leave the cpu alone and just get the card maxed out. Maybe some benchmarks at 4k. My 2700x at 1080p can't keep up with the 6900xt.


The 2700X has the same chips as the 2920X. Based on my own testing, going from a 2920X to a 5600X showed I was leaving up to 60% performance on the table. As an example, the TWWH2 battle benchmark returned 102 FPS @1440p high with the 2920X and 157 FPS @1440p high using the 5600X. Having said that, the 6900XT is about 15% faster than my 6800XT. I wish Asus would get off their butts and release a BAR-support BIOS for my X399 Strix. I have said it before, but I would really like to see how HBM works with BAR.


----------



## turbogear (Feb 11, 2021)

ratirt said:


> I will have to try it out at some point but I'm worried my PSU is not enough.


Yes, the Corsair 760W supply your specs show might not be enough.
I have a 1200W Corsair HX1200i, which is Platinum certified.

I have a lot of other stuff in my PC: 4 HDDs, an extra RAID controller, 8 fans, lighting, and a Laing DDC 18W pump. 

In any case, it doesn't always help to throw more power at the GPU.
Depending on ASIC quality, at some point you're just burning power for very little extra performance.

For me the sweet spot is what I mentioned in previous posts. Increasing the TDC and GPU power further brings only slightly higher performance, which is also not stable.
My GPU reaches its voltage limit at around a 2.7GHz setting.
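A back-of-the-envelope PSU headroom check along those lines (all the component draws below are illustrative guesses, not measurements):

```python
# Crude steady-state PSU headroom check. Sizing to ~80% of rated
# capacity leaves margin for transient spikes on these cards.
def psu_headroom(psu_watts, draws, margin=0.8):
    total = sum(draws.values())
    return total, total <= psu_watts * margin

# Placeholder draws for a heavily OCed RX 6900 XT system (guesses):
draws = {"gpu": 450, "cpu": 140, "fans_pump": 30, "drives_misc": 40}
total, ok = psu_headroom(760, draws)
print(total, ok)  # a 760 W unit leaves little headroom in this scenario
```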


----------



## Athlonite (Feb 11, 2021)

ratirt said:


> I really need to get some time to look at this stuff. I havent noticed anything so far but I keep forgetting. I'm getting a new monitor soon (damn hope it will arrive) and I need to double check that with FreeSync Premium.
> BTW. I tried CP2077 yesterday and damn, at some point I got 20FPS 60FPS is spot on hehe. That game is crazily demanding.



What res are you playing at? Have you tried the FidelityFX dynamic setting? Try 90% to 100%, and also turn off motion blur; it's a frame-rate vampire.


----------



## ratirt (Feb 11, 2021)

Athlonite said:


> what res are you playing at and have you tried the Fidelity FX Dynamic setting try 90% to 100% and also turn off motion blur it's a frame rate vampire


I play at 4K. Haven't tried turning off motion blur, but FidelityFX I did, 80 to 100. It depends where you are in the game, and the FPS drops accordingly. I noticed that whenever I run in-game the FPS drops; I think it might have something to do with the motion blur.


----------



## turbogear (Feb 11, 2021)

kapone32 said:


> Having said that the 6900XT is about 15% faster than my 6800XT.


I wonder why you say the 6900XT is 15% faster than the 6800XT.
According to the TPU review ("AMD Radeon RX 6900 XT Review - The Biggest Big Navi", www.techpowerup.com), it is at most 9% faster than the 6800XT.

With my overclocked 6800XT, it is actually faster than a reference, non-overclocked 6900XT.


----------



## IvanP91v (Feb 11, 2021)

GamerGuy said:


> C'mon, guys, surely there're more than two RX 6000 series owners here.....


Thanks to my bro, I finally got this gem. It constantly boosts to 2500MHz, and that's not listed anywhere; not sure if that's normal. Temps etc. are all fine while it is boosting.


----------



## Athlonite (Feb 12, 2021)

IvanP91v said:


> Thanks to my bro, I finally got this gem. It constantly boosts to 2500Mhz and that's not listed anywhere, not sure if that's normal. Temp,etc are all fine when it is boosting.


As long as temps and power consumption are under control, boosting to those speeds should be the norm.


----------



## kapone32 (Feb 12, 2021)

turbogear said:


> I wonder why you say 6900XT is 15% faster than 6800XT.
> According to TPU review it is at most 9% faster than 6800XT.
> 
> 
> ...


I took that information from all of the reviews I have watched, factoring SAM into the equation.


----------



## ratirt (Feb 12, 2021)

I've noticed something weird. My 6900XT's default vcore setting is 1175mV. That's like bumping the voltage for an OC. I definitely need to lower it. For now I have it working at 1.1V, but I think I can go lower. I might try an OC, but I won't get too far with my current PSU. Problem is, there aren't a lot of PSUs for sale at any store I've checked; 850W is the minimum, and those are nowhere to be found either.


----------



## turbogear (Feb 13, 2021)

ratirt said:


> I've noticed something weird. My 6900xt default settings for vcore is 1175mv. That's like wen you OC you bump the voltage. I need to definitely lower it. For now I got it working at 1.1v but I think I can go lower. I might try OC but i won't get to far with my current PSU. Problem is, there isn't a lot of PSUs for purchase at any store I've checked 850 is a minimum and those are nowhere to be found either.


1175mV is the stock voltage for the 6900XT, and 1150mV is the default for the 6800XT.
These values are fixed, and at the moment it is not possible to increase them further even with tools like MPT.
Increasing them will send the GPU into safe mode and lock the clock at 500MHz.

When you lower the voltage in the Radeon software, you are changing the boosting behavior.
If you track the voltage by logging, for example with HWiNFO, you will see it goes higher than what you set it to; but by lowering the voltage in the Radeon settings, you let it boost higher. 

I have my 6800XT set at 2650MHz @ 970mV, and in HWiNFO I see the card going to 1100mV with the frequency boosting to 2600MHz.
With default settings, the boost hits the 1150mV limit much faster, and the frequency cannot go that high within the same power consumption limits.

If you really want to limit the maximum boost voltage allowed, you need to reduce it in MPT. Doing this can reduce power consumption at the same frequencies.
Some people are able to limit this value to around 1070mV on a 6800XT and still reach frequencies in the range of 2600MHz.
My card does not like this, especially when I push the frequency as high as I am doing.
The GPU is unstable if I set it lower than 1130mV on my card.


----------



## kapone32 (Feb 13, 2021)

turbogear said:


> 1175mV is the stock voltage for 6900XT and 1150mV is the default for 6800XT.
> These values are fixed and at the moment it is also not possible to increase further by tools like MPT.
> Increasing this will send the GPU into safe mode and lock the clock at 500MHz.
> 
> ...


I am going to try this


----------



## turbogear (Feb 13, 2021)

kapone32 said:


> I am going to try this


In case you have not used MPT before and want to read more about it before trying something, here is a good article at Igor's Lab about using MPT:
https://www.igorslab.de/en/the-grea...ith-big-navi-and-the-morepower-tool-practice/


----------



## Felix123BU (Feb 13, 2021)

turbogear said:


> For the Nitro+ 6800XT, the GPU power limit is set to 290W by default, compared to 255W on the reference design card.
> This means that when you apply the 15% power limit you allow ~333.5W to the GPU core. You need to add to that around 45W for the other components, which means you have a total of around 378W.
> This is close to the 375W limit of what 2×8pin and the PCIe slot can supply in combination.  You are driving your card at its limits.
> 
> ...


Are those Alphacool parts in the loop? I have something very similar coming next week.


----------



## turbogear (Feb 13, 2021)

Felix123BU said:


> Are those Alphacool parts for the loop? I have something very similar coming next week


No, my loop consists mainly of EK components (rads, fans, tubing, and reservoir/pump combo). 
The 6800XT block is also from EK.
The CPU block is from XSPC, as is the liquid used in the loop.
I have an Aquacomputer MPS, which is a flow and temperature sensor for the loop.
I also have an Aquacomputer filter in the loop with two valves, which helps when opening the loop so that liquid does not flow out from both ends.
Only a few of the fittings in the loop are from Alphacool. 

I have been into water cooling since 2014.


----------



## ratirt (Feb 13, 2021)

turbogear said:


> 1175mV is the stock voltage for 6900XT and 1150mV is the default for 6800XT.


I thought it was a bit lower than that, but I guess I must have missed something.


----------



## Felix123BU (Feb 13, 2021)

turbogear said:


> No my loop consists mainly of EK components (rads, fans, tubing, reservoir and pump combi).
> 6800XT block is also from EK.
> The CPU block is from XSPC as well as the liquid used in the loop.
> I have Aquacomputer MPS which is loop flow and temperature measurement sensor.
> ...


I see. Well, for me it's my first full loop, so they all look the same from a distance, but I shall get pro soon enough... before breaking stuff on the first try.


----------



## turbogear (Feb 13, 2021)

Felix123BU said:


> I see, well, for me its my first full loop, so all look the same from a distance, but I shall get pro soon enough....before breaking stuff at the first try


YouTube is a good friend when building stuff for the first time. 
It helped me a lot last summer when I built my own expensive cross bike, buying all the components separately and putting them together without breaking anything.


----------



## Felix123BU (Feb 13, 2021)

turbogear said:


> Youtube is a good friend when building stuff for the first time.
> It helped me a lot when last summer I built my own expensive Cross bike by buying all separate components and putting them together without breaking anything.


Yup, from a theory point of view I'm all set: watched vids, read articles, etc. Really hyped. I've done everything there is to do with a PC; a full water loop is the only thing I haven't done, and I have a lot of crazy cooling mods on record. I'm really hyped to see how my 6800 XT, and the whole system, hums along under water. I haven't been this impressed by a GPU since I got the ATI 4870 in 2008, but this 6800 XT blew me away; I really want to play around with it under optimal conditions.


----------



## turbogear (Feb 14, 2021)

Felix123BU said:


> yup, from a theory point of view I'm all set, watched vids, read articles etc. Really hyped. Did everything there was to do with a PC, full water loop is the only thing I have not done, and I have a lot of crazy mods on record regarding cooling, but I am really hyped to see how my 6800 XT, and the whole system hums along under water   Was not as impressed by a GPU since I got the ATI 4870 in 2008, but this 6800 XT blew me away, I really want to play around with it under optimal conditions


Looking forward to seeing your setup in action when you have it completed. 

Just out of curiosity, which high-static-pressure fans did you order?
I am using EK Vardars.
How many radiators do you plan to use?
I have three: 1×120mm, 1×240mm, and 1×360mm.
I use the EK medium-thickness radiators (PE) because there is no room in my case for a push-pull configuration. The thick radiators (XE) need push-pull for optimum cooling.

For thermal paste, I am using Thermal Grizzly Kryonaut. 
I was thinking of buying Kryonaut Extreme, but in November, when I built my 6800XT block, only the big 33.84g package was available, for ~87€. Now a 2g package is also available, but still for 20€. 

Indeed, the 6800XT is a great card. I have been with the red team since the days of the 4870; all my cards since then have been Radeons.


----------



## Felix123BU (Feb 14, 2021)

turbogear said:


> Looking forward to see your setup in action when you have it completed.
> 
> Just for curiosity  which high static pressure fans did you order?
> I am using EK Vardar.
> ...


I am getting the loop mainly for the 6800 XT, but also to try full-loop cooling.

As for fans, I have a couple of Noctua P12s (they are really good on the current AIO), a couple of Noctua P14 redux 1500rpm, and another couple of Noctua A14 industrial 2000rpm. I'll see which ones make the best combo, but I want to fit the P12s on the 2×280mm, 30mm-thick radiators; thicker ones won't really fit in the current case. Fans will be push only.

The 6800 XT will get the Eisblock Aurora Acryl GPX-A Radeon RX 6800XT block, and the 5800X will get the Alphacool Eisblock XPX block.
One thing I am not sure was the best choice is the pump; it's a small DC-LT 2600rpm unit. If I am not pleased, I will probably get a stronger one. I mainly wanted it to be quiet, so in 2-3 days I'll see how it goes. For paste, I have a lot lying around; I'll probably choose between Thermal Grizzly Kryonaut and Gelid GC Extreme, as both have had great results on GPUs.

I have also been Radeon-only since the 4870; whenever it was time for an upgrade, the best choice for price/performance/longevity was always a Radeon card. I wanted to give Nvidia a chance with the 3080, but there are still none to be found.

I was more than happy to get a 6800 XT on launch day. Will post pics when the new setup is complete, and hopefully the result will be worth it.


----------



## xenosys (Feb 14, 2021)

New poster, and proud owner of a new 6800XT!  Just picked up a PowerColor reference model last week. Had to pay £100 over MSRP to secure it, but it was well worth it in the end once I got it home, put it under water, and started running some benchmarks.

I bought an EK Quantum Vector water block and installed it a few days ago as well, on a custom loop with a couple of EK 360mm radiators (1× PE and 1× XE) to cool both the CPU and GPU.

At idle I'm around 30 degrees, and at full load during a Port Royal/Time Spy stress test I'm around 54-55 degrees with a junction temp of around 70 after an hour of looping.  During gaming, I'm usually around 50/66-67 respectively.

In a Time Spy bench I was hitting a graphics score of 20329 just by tinkering with the Radeon software and undervolting: core clock slider set between 2600-2700MHz, 960mV, and 2110MHz fast timings on the memory. It would stay at a consistent 2610-2620MHz in benches. When I used the "auto overclock" feature in the software, it set 2508MHz automatically, so I knew immediately I had a pretty good card on my hands.

I then tinkered around a bit more with MPT (MorePowerTool) and increased the total wattage of the card from 300W to 340W, and managed to get it benchmarkable with the slider between 2690-2790MHz, outputting around 2740MHz, while increasing the voltage to 1085mV.  I did manage to score 20900+ in Time Spy (screenshot attached), but for some reason I had an issue with SystemInfo so couldn't validate the score.  It wasn't stable, though.

Then I put it back down to 2655-2755 on the slider and scored 20650.  This is the most stable I've managed to get it since.

All in all, I must have lucked out in the silicon lottery to pick up a card this decent at OC'ing, so in hindsight I'm quite happy to have paid the extra £100.


----------



## turbogear (Feb 14, 2021)

Felix123BU said:


> One thing I am not sure if I made the best choice is the pump, its a DC-LT 2600 rpm small pump. But in case I am not pleased, will probably get a stronger one. Wanted it to be quiet mainly, so in 2-3 days will see how it goes.


I am using an EK-DDC 3.2 PWM pump: an EK-branded Laing with an extra EK heatsink mounted at the bottom to help cool the motor and prolong its life.

I have the DDC's PWM line connected to the CPU fan header on my ASUS Crosshair VIII Hero. 
In AI Suite I have a profile that controls the speed of the cooling fans and pump depending on CPU temperature.
When doing normal office stuff like surfing, the loop is quiet; it ramps up when gaming.
I allow it to ramp up quite high while gaming, producing more noise but optimum temperatures.
I game with a headset in any case, so it doesn't disturb much. 

One thing I noticed after putting the reference 6800XT under water is that there is significant coil whine. 
With the original cooler it was hidden behind the fan noise, but the Vardar fans in the loop are not that loud, so I always hear it when running a 3D task like Time Spy without a headset on.


----------



## Felix123BU (Feb 14, 2021)

xenosys said:


> New poster, and proud owner of a new 6800XT!  Just picked up a Power Color Reference model last week.  Had to pay £100 over the MSRP to secure it but well worth it in the end once I got it home, put it under water and started running some benchmarks.
> 
> I bought myself an EK Quantum-Vector Water Block and installed it a few days ago as well, and put it on a custom loop with a couple of EK 360mm Radiators (1 x PE and 1 x XE) to cool both the CPU and GPU.
> 
> ...


I also have the reference from PowerColor, and mine is also a very good overclocker; even with the stock cooler it can do 2.6 on air, and I'll see how much more it can do on water. Temps become an issue after 2.6, hence the coming water loop.



turbogear said:


> I am using EK-DDC 3.2 PWM pump. EK branded Laing with extra EK heat sink mounted at the bottom to help with cooling of the motor and prolong the life of it.
> 
> I have the PWM from DDC connected to my ASUS Crosshair VIII hero CPU fan port.
> I have in AI Suite a profile to control the speed of cooling fans and pump depending on CPU temperatures.
> ...


Coil whine, oh yeah, the whine is strong in this one. But I also game with headsets, so noise while gaming is not an issue; what I want is quiet while working. It seems all reference 6800 XTs have coil whine. I was thinking of doing some coil modding, but that seems overkill right now.
Regarding the pump, I might not have made the ideal choice, but as long as it moves water and the temps are at least decent, I can live with it until I can get a stronger pump, if needed. Right now I have a Corsair H90 AIO on the 6800XT; that's only a 140mm rad, and it has absolutely no issues cooling the GPU chip, but the VRAM chips need better cooling.


----------



## turbogear (Feb 14, 2021)

xenosys said:


> I think tinkered around a bit more with MPT (More Power Tool) and increased the total wattage output on the card from 300w to 340w, and managed to get it benchmarkable on the slider between 2690-2790mhz and outputting around 2740mhz whilst increasing the voltage to 1085mv.  I did manage to then score 20900+ on TimeSpy (screenshot attached) but for some reason I had an issue with System Mark so couldn't validate the score.  It wasn't stable though.
> 
> Then, put it back down to 2655-2755 on the slider and scored a 20650.  This was the most stable I've managed to get it since.
> 
> All in all, I must have lucked out with the silicon lottery to pick up a card that decent on OC'ing, so I'm quite happy to have paid the extra £100 for it in hindsight.


Welcome to the owners' club. 
You are a lucky guy and got a very well performing card.
Mine can stably reach at most 2580-2612MHz, with a 19372 Time Spy score.


----------



## xenosys (Feb 14, 2021)

turbogear said:


> Welcome to owners Club.
> You are lucky guy and got a very good performing card on your hands.
> Mine can most stably reach 2580-2612MHz with 19372 Time Spy score.



Thanks! Yeah, definitely lucked out.

I was getting a stable 2610-2620MHz without using MPT, so a similar clock to yours.

However, I was only getting 18000-19000 when I first started tinkering with the settings.  Most people would be tempted just to max out all the sliders immediately, see what they score, and leave it there if it's stable. This card seems to benefit a lot from undervolting in Wattman; as soon as I started adjusting the core clock along with the voltage, the score just kept going up and up, until I hit 20k.


----------



## turbogear (Feb 14, 2021)

xenosys said:


> Thanks! Yeah, definitely lucked out.
> 
> I was getting a stable 2610-20mhz without using MPT, so a similar clock to yours.
> 
> However, I was only getting 18000-19000 when I first started tinkering around with the settings.  Most people would be tempted just to max out all the sliders immediately, see what they score, and leave it there if it's stable. This card seems to benefit a lot from "undervolting" in Wattman.  As soon as I started adjusting the core clock along with the voltage, the score just kept going up and up, until I hit 20k.


Yes, it takes a lot of time to find the maximum performance.
Like you, I played with a lot of parameters and with MPT.
In the beginning I was not getting scores that high; after many trials I found what works best for me:
2650MHz @ 970mV, with 275W for the GPU and 305A TDC in MPT, plus a 15% power limit in the Radeon software.

Above a 2650MHz setting in Radeon, nothing is very stable for me long-term.
I am able to set 2750MHz @ 1025mV (290W and 350A TDC in MPT, plus the 15% power limit), which gets me ~19750 in Time Spy, but it is not really stable in games. With those settings I often got crashes in Assassin's Creed Valhalla, while 2650MHz @ 970mV is rock stable.


----------



## xenosys (Feb 14, 2021)

Felix123BU said:


> Also have the reference form Power color, and mine is also a vey good overclocker, even with the stock cooler, can do 2.6 on stock air, and will see how much more it can on water. Temps become an issue after 2.6, hence the coming water loop.



Sounds like you might have a nice overclocker on your hands as well!  

When you use the Radeon software presets to auto-overclock, what core clock does it give you?


----------



## Felix123BU (Feb 14, 2021)

xenosys said:


> Sounds like you might have a nice overclocker on your hands as well!
> 
> When you use the Radeon software presets to auto-overclock, what core clock number does it give you?


I never used the auto-OC presets; I played around with it manually, and the max I can get game-stable is 2750 max and 2650 min, which stays around 2680-2710 in games, 100% stable. But I don't use that day to day, because on the stock cooler it gets too hot for my liking, even with an aggressive fan curve. Undervolting is a must for these frequencies, in my opinion; you can get it running at stock voltage, but the clocks won't stay that high without undervolting. Until the water loop components arrive, I keep it at 2550 max and 2450 min, which gives me 2.5 in games.


----------



## turbogear (Feb 14, 2021)

xenosys said:


> Sounds like you might have a nice overclocker on your hands as well!
> 
> When you use the Radeon software presets to auto-overlock, what core clock number does it give you?


I did not try the auto-overclock before. 
I tried it just now and I get 2509MHz, which is similar to yours. 
I guess I need to investigate further. 
Maybe there is a possibility to go higher on the overclock than I have achieved so far.   
I have to say that I did not try voltages that high when I was running 2750MHz. 
I went to around 1030mV, and I see that you go up to 1085mV.


----------



## xenosys (Feb 14, 2021)

turbogear said:


> I did not try the auto-overclock before.
> I tried it right now and I get 2509MHz which is similar to yours.
> I guess I need to investigate further.
> Maybe there is a possibility to go higher on the overclock than I achieved so far.
> ...


I'd definitely do it.  Usually 2500MHz+ on the auto-overclock recommendation is a sign of a good OC card.  Keep in mind that I did need to use MPT to raise the card's power limit to 340W in order to support that clock speed.


----------



## turbogear (Feb 15, 2021)

xenosys said:


> I'd definitely do it.  Usually 2500MHz+ on the auto-overclock recommendation is a sign of a good OC card.  Keep in mind that I did need to use MPT to raise the card's power limit to 340W in order to support that clock speed.


I tried tuning it further after our discussion.
I was able to achieve a 20121 score. 

The settings are similar to what I tried before, which got me a ~19750 score.
The difference is that I fiddled with the VRAM this time and strangely found that while 2050MHz was causing issues before, 2100MHz was accepted without issues. Higher than 2100MHz caused a benchmark crash.
Increasing the VRAM clock brought the extra performance that was missing to reach a 20K score the last time I tried similar settings.

I am not too confident that this setting is fully stable, though. I did have one benchmark crash today when I increased the clock setting from 2750MHz to 2755MHz. 
I was not able to stabilize it the last time I tried similar settings.

I will try to run stability tests tomorrow. It is getting too late in Germany now.


----------



## GamerGuy (Feb 15, 2021)

Keep 'em comin', guys! This thread is growin'!!!


----------



## ratirt (Feb 15, 2021)

I will need to try OC, but I will need some time, which I don't have at the moment. I tried a bit with the voltage, but it would seem going lower than 1 volt is out of the picture. Also, I would need to purchase a new PSU, and there is absolutely nothing available. :/ Still waiting for some stuff to get to my place. I have to admit that the Red Devil I got stays relatively cool, but I know there's room for improvement.


----------



## xenosys (Feb 15, 2021)

turbogear said:


> I tried tuning it further after our discussion.
> I was able to achieve 20121 score.
> 
> The settings are the similar to what I tried before and that got me before ~19750 score.
> ...



Nicely done, my friend. Welcome to the 20k club! 

I was in a similar position with my memory settings.  For some reason my card just doesn't like the memory at 2150MHz, whereas I've seen a lot of others maxing out that particular setting.  My stable setting is anywhere between 2100-2120MHz.  Anything higher than 2130MHz will definitely produce a green screen of death.  For every 10MHz overclock on memory, TimeSpy gives you an extra 20-30 points approximately.

I'd run a Timespy GFX2 test on loop for about an hour to test stability.



ratirt said:


> I will need to try OC, but I will need some time, which I don't have at the moment. I tried a bit with the voltage, but it would seem going lower than 1 volt is out of the picture. Also, I would need to purchase a new PSU, and there is absolutely nothing available. :/ Still waiting for some stuff to get to my place. I have to admit that the Red Devil I got stays relatively cool, but I know there's room for improvement.



With a Red Devil, it should be capable of a good overclock, especially if you keep the temps low.  The Red Devils are some of the better-binned cards.


----------



## ratirt (Feb 15, 2021)

turbogear said:


> I tried tuning it further after our discussion.
> I was able to achieve 20121 score.
> 
> The settings are the similar to what I tried before and that got me before ~19750 score.
> ...


Damn. The 5800X CPU does the trick for your 6800XT.


----------



## kapone32 (Feb 15, 2021)

I was having an issue over the weekend where, after a long session, my PC would just shut down. I thought it was my air cooler. Well, I had the PSU (XPG Core Reactor 750) installed backwards, so the PSU was getting no fresh air. I undervolted the GPU in Adrenalin by 10%. I saw my clocks running at a stable 2400MHz. My TimeSpy score was 18454 with a 5600X. For some reason 3DMark does not register my 4.7GHz OC.


----------



## xenosys (Feb 15, 2021)

Here's my 6800XT validated result on TimeSpy that I completed a few moments ago.

Upped the wattage in MPT to 365W, upped the memory to 2130MHz, and adjusted the slider to 2685-2785MHz, then crossed my fingers and toes.  It stuck around 2740MHz for most of the test and ended up with 20930+ again.  There's no way that will be stable in games, though.  Did a check on 3DMark and it's the 12th highest GPU score across the board for a 6800XT.  I'll take that! 

I tried maxing everything out to try and get at least one 21000+ result but kept crashing on GFX Test 2.  No surprises there! 

I've also included my stable clock as well.


----------



## turbogear (Feb 15, 2021)

xenosys said:


> Here's my 6800XT validated result on TimeSpy that I completed a few moments ago.
> 
> Upped the wattage on MPT to 365w, upped the memory to 2130mhz, and adjusted the slider to 2685 - 2785mhz, then crossed my fingers and toes.  It stuck around 2740mhz for most of the test. Ended up with 20930+ again.  There's no way that will be stable in games though.  Did a check on 3D Mark and it's the 12th highest GPU score across the board for a 6800XT.  I'll take that!
> 
> ...


Wow, that's a great result.

With 365W on the GPU core you are torturing your GPU VRMs, my friend.
According to my estimations you need to add around 45W for the other components on the GPU, which means you were drawing around 410W total GPU power. That is ~35W above the 375W spec for 2x8-pin plus the PCIe slot.


----------



## xenosys (Feb 15, 2021)

turbogear said:


> Wow that's a great result.
> 
> With 365W on GPU core  you are torturing your GPU VRM my friend.
> According to my estimations you need to add around 45W for other components on GPU which means you were drawing around 410W total GPU power which is ~35W above 375W spec of 2x8pin and PCIe slot.



Oh yeah, I've since taken it right back down again; it was just for benchmarking purposes.  I'll take it down to around 350-360W total draw tonight.  Just out of curiosity, what does the other 45W consist of on the card?


----------



## turbogear (Feb 15, 2021)

xenosys said:


> Oh yeah, I've since taken it right back down again, it was just for benchmarking purposes.  I'll take it down to around 350-360w total draw tonight.


Yes, I had 275W on the core before, plus the 15% power limit, which is around 361W total power; that gave me a ~19300 score with the 2650MHz@970mV setting.
That is still my target daily gaming setting.  
As I have now confirmed that 2100MHz VRAM is stable, I will add that to my setting above and see how far above 19300 my score goes.

I applied the 2100MHz VRAM overclock and the score jumped from 19300 to 19670.


I tuned the settings further to squeeze out more performance within a total GPU power consumption of around 360W.
The maximum I could achieve with a total GPU power of 361W at 310A TDC was 19828 in Time Spy and 10158 in Port Royal. 
This is with 2685MHz@980mV.

I got close to the 20K from before without going to 290W GPU core (378W total GPU power) like I did in the previous experiment, where the score was above 20K.
This will be my new daily gaming setting, as I don't like to drive the power constantly in the 375W range. 

I ran the Fire Strike Extreme stability test with a result of 99.5%, which can be increased to 99.8% by raising the voltage from 980mV to 990mV, but with that the Time Spy score drops from 19828 to 19789.
I think I will leave it at 980mV and test games to see if this is stable everywhere.


----------



## xenosys (Feb 16, 2021)

turbogear said:


> Yes I had 275W on core before plus 15% power limit which is around 361W total power that gave me before ~19300 score with 2650MHz@970mV setting.
> That is still my target daily gaming setting.
> As I now noticed that 2100MHz VRAM was stable.
> I will add that to my above setting and see how high my score goes above 19300
> ...



Wow, nicely done!

Anything above 10k in TSE is very rare for a 6800XT, but you've topped it with 158 points to spare.  In fact it puts you 4th on the all-time TSE list, according to 3DMark.

I've noticed that when playing games, the card doesn't draw nearly as many watts as a Firestrike or Timespy stress test does.  Sometimes it would hit 330W for me, but in games it never usually goes above 300W while still maintaining the same core clock.


----------



## turbogear (Feb 16, 2021)

xenosys said:


> Wow, nicely done!
> 
> Anything above 10k in TSE is very rare for a 6800XT, but you've topped it with a 158 points to spare.  In fact it puts you 4th on the all time TSE list, according to 3D Mark.
> 
> I've noticed that when playing games, it doesn't draw nearly as many watts from the GPU itself that a Firestrike or Timespy stress test does.  Sometimes it would hit 330w for me, but in games, it never usually goes above 300w but still maintains the same core clock rate.


Thanks.  
I will run it again this evening to check whether the 10158 score is reproducible with the same settings or just a one-time surprise.  

Yes, I also noticed that in-game power consumption was lower.
In Assassin's Creed Valhalla the GPU core consumption for me is usually in the range of 220W to 250W. This was tested on the earlier 2650MHz@970mV settings.
I will check with the new 2685MHz@980mV in the next days and log with HWiNFO to get plots for averages and peaks.
I need to run some tests with Cyberpunk as well. I haven't played that in many weeks now.


----------



## ratirt (Feb 16, 2021)

turbogear said:


> In Assassin's creed valhalla the GPU Core consumption for me is usual in range from 220W to 250W. This was tested before on 2650MHz@970mV settings.
> I will check with the new 2685@980mV in the next days and log with HWiNFO to get plots for averages and peaks.


Dude, your scores and OC are awesome. How did you manage to get 2685MHz with only 980mV? Did you move the power slider (15% up) to maximum as well?
I understand the memory is maxed out at 2150MHz. Have you validated your OC to confirm there is no regression in FPS? I know that people claim a lot when overclocking, but the cards often perform worse than expected in games.


----------



## turbogear (Feb 16, 2021)

ratirt said:


> Dude. Your scores and OC is awesome. how did you manage to get 2685Mhz with only 980mv? Did you move the slider for power (15% up) for maximum as well?
> I understand the mem is maxed out to 2150Mhz. Have you validated your OC and you have no regression in FPS? I know that people claim a lot when OC but the cards perform worse than expected in games.


I have the 15% power slider, plus in MPT the GPU power set to 275W and TDC to 310A. So the GPU is allowed to consume a total power of 361W, compared to the default of 300W.
I am supplying 60W extra to the GPU core and 10A more TDC to go that high in frequency.

I don't have VRAM at 2150MHz. I can only go to 2100MHz; higher is unstable.

As I explained last time, 980mV is a setting used in the Radeon software, but it is not the actual voltage the GPU uses when boosting.
You can't reach 2600MHz with only 980mV of real GPU voltage. The 980mV in Radeon does not correspond directly to the voltage used to boost. 
It only adjusts the boost curve so that, towards 1150mV real voltage, the GPU can boost much higher than default.

Looking at HWiNFO, the actual voltage goes towards 1150mV when the GPU boosts towards 2620-2630MHz.
You can't influence the actual boost voltage through the Radeon settings. It goes as high as the GPU needs, but the limit is 1150mV.
The only way to limit the real maximum voltage allowed during boosting is by using MPT. If I set a hard limit in MPT lower than 1130mV, the GPU crashes when boosting towards 2600MHz.

For 2650MHz@970mV, I did a gaming check and the FPS was higher; in games the GPU held a 2580-2612MHz boost range.

For 2685MHz@980mV, as I mentioned before, I will do validations in the next days by logging data and comparing against default-setting performance. This new setting is a work in progress.

I am thinking of installing the NVIDIA FrameView utility to analyse the FPS in a more scientific way than looking at the counter during games.
My only concern is whether it will play nicely with AMD hardware.

I use RivaTuner Statistics Server to see frames while gaming. HWiNFO can log this, but for some reason it does not log all the time; many lines are left empty.

If anybody here knows which utility is best for logging frame rates, please let me know. I'd like to make charts like the reviewers do.


----------



## Robert Bourgoin (Feb 16, 2021)

I got an XFX AMD Radeon RX 6800, strictly stock. Love it. It was part of my new build in my signature.
I paid too much for it on Amazon, but I wanted it for my build. Already a collector's item. LOL
Part of my nicest build. This system has been flawless from day one.


----------



## turbogear (Feb 16, 2021)

ratirt said:


> Dude. Your scores and OC is awesome. how did you manage to get 2685Mhz with only 980mv? Did you move the slider for power (15% up) for maximum as well?
> I understand the mem is maxed out to 2150Mhz. Have you validated your OC and you have no regression in FPS? I know that people claim a lot when OC but the cards perform worse than expected in games.


I did some data analysis with the 2685MHz@985mV overclock. 
You can see the performance is uplifted by the OC.
In the Assassin's Creed Valhalla and Borderlands 3 benchmarks the increase is around 7% in performance.

Here are the plots comparing Superposition runs at default and OC. I have also run the Borderlands 3 built-in benchmark and compared the default to my OC.
The plots compare the GPU frequency and the FPS data from RivaTuner Statistics Server.

I discovered a small tool called GenericLogViewer that can help compare HWiNFO log files. 
LogViewer for HWiNFO is available ! | HWiNFO Forum

Borderlands 3 benchmark:
The gain is around 7% on average FPS.




Superposition benchmark:


Results from Assassin's Creed Valhalla.
The gain is around 7% on average FPS. 

With the 2685MHz@985mV:


With default GPU settings:


----------



## Felix123BU (Feb 19, 2021)

OK, so the 6800 XT and the whole system are under water. 
As expected, I made some rookie water-build mistakes, as in I did not think about how the tubes would route, hence some ghetto routing, but all good for a start.
The GPU cooling is insane with this setup; even after 1 hour of FurMark at the hottest settings I could figure out, it never went above 60°C, and 80°C on the hotspot. General gaming is like 50°C max, and 70°C on the hotspot in the more demanding games.

I tried a 2.75GHz OC, and it was running, but I am 99.9% sure it did not have enough power for that clock. It needs moooar powaah. 
Do any of you have experience with the More Power Tool, as in, can you increase the available power, and if yes, does it actually allow the card to draw more?
I had a nice experience a long time ago with an RX 480; I modded it and let it use 300W, basically double what it should have. I can only dream of what the 6800 XT could do with, let's say, a 450W limit.


----------



## turbogear (Feb 19, 2021)

Felix123BU said:


> OK, so the 6800 XT, and whole system is under water
> As expected, made some rookie water build mistakes, as in did not think of how the tubes would  go, hence had to do some getto routing, but all good for a start.
> The gpu cooling is insane with this setup, even after 1 hour furmark with the hottest setting I could figure out, it never went above 60 and 80 on the hotspot. General gaming is like max  50 and 70 on hotspot, the more demanding games.
> 
> ...


Great that you got your first custom loop. 

The More Power Tool is what I am using all the time; I have mentioned it many times here in the forum as MPT (More Power Tool). 
Before you try anything, I would suggest having a look at this good article at Igor's Lab about using MPT:
https://www.igorslab.de/en/the-grea...ith-big-navi-and-the-morepower-tool-practice/

The settings that I am using in MPT can be seen above at post #118. 








The RX 6000 series Owners' Club – www.techpowerup.com




Just a piece of advice: as you have a reference card with 2x8-pin connectors, the maximum power draw should not be allowed to go above 375W.
Below you see the screenshot from More Power Tool, with the two fields one can modify marked in yellow. The guide linked above explains more.

The *Power Limit GPU* (default 255W) is the GPU core power only, not the total power of the GPU.
*TDC Limit GFX* (default 300A) is the current drawn by the GPU.
I would not advise setting more than 290W here if you are also going to use the *Power Limit (%)* slider in the Radeon driver at 15%.
If you set 290W here, you need to add the 15% from the Radeon settings, which means 290x1.15=333.5W of power to the GPU core only.

Now you need to add to that another 45W for the other components on the GPU, like the VRAM, which means the total power consumption of the GPU would be 45+333.5=378.5W. 
You see that with this you are at the limit of 375W allowed for 2x8-pin cards.

This also tells us that we have a headroom of 375W-300W=75W to play with.
We should not increase more than that. I actually don't plan to push my card above 361W for daily use.
That is why I set 275W in the Power Limit GPU field.

You should not go to 450W. There is a risk you burn the VRMs on the GPU. 
It could be okay for a short benchmark, but not when running something long term. 
If you had one of the cards with 3x8-pin, you would have a 525W limit (150W from each 8-pin plus 75W from the PCIe slot).
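The budget arithmetic above can be sketched in a few lines. The flat ~45W allowance for non-core components and the per-connector limits are the estimates used in this thread, not measured values:

```python
# Rough RDNA2 board-power budget, per the reasoning above.
PCIE_SLOT_W = 75    # power available from the PCIe slot
EIGHT_PIN_W = 150   # power available per 8-pin connector

def board_power(mpt_gpu_power_w, radeon_slider_pct, other_components_w=45):
    """Total board power: the MPT core limit scaled by the Radeon power
    slider, plus a flat estimate for everything that isn't the core."""
    core = mpt_gpu_power_w * (1 + radeon_slider_pct / 100)
    return core + other_components_w

def connector_limit(num_8pin):
    """Spec power limit for a card with the given number of 8-pin plugs."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

# 290W in MPT plus the 15% slider sits just over the 2x8-pin limit:
total = board_power(290, 15)   # 290 * 1.15 + 45 = 378.5W
limit = connector_limit(2)     # 375W
print(f"{total:.1f}W total vs {limit}W connector limit")
```

With 275W in MPT instead, the same math gives 361.25W total, which is why that value is comfortable for daily use.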


----------



## Felix123BU (Feb 19, 2021)

turbogear said:


> Great that you got your first custom loop.
> 
> The More Power tool is what I am using all the time and mentioned it lot of times here in the forum as MPT (More Power Tool).
> Before you try something if you don't remember it anymore then I would suggest to have a look at the this good article at Igor's Lab about using MPT:
> ...


Thank you for the info on the limits; I'm quite familiar with them, and would assume those are the values you use.  And Igor is on my watchlist, funny guy, like him.
I did notice your posts on MPT. I wanted to know whether, in your experience, the power limits are actually raised; I heard AMD does not really allow it, and even if you set other limits they stay the same, hence my question. Did you notice the card drawing more, or close to what you set?

And regarding the max drawn via the 2x8-pin power cables: they can draw quite a lot more if they are not utter crap quality and if the PSU can deliver, but yeah, the VRMs could get in trouble with a lot more power in.   
I had an AMD RX 480 BIOS modded, and that drew up to 300W over a single 6-pin power cable; it still works to this day in a friend's PC. But I would not risk the same with the 6800XT, it's a bit too expensive to play that close to the edge. 

Oh, and I would sooo love to be able to modify the BIOS on this 6800 XT and write it to the card. AMD should cater to the enthusiasts, but sadly it's just becoming a red Nvidia.


----------



## turbogear (Feb 19, 2021)

Felix123BU said:


> Thank you for the info on limits, quite familiar with them, would assume those are the values you use.  And Igor is on my watchlist, funny guy, like him.
> I did notice your posts on MPT, wanted to know if from your experience the power limits are actually raised, heard AMD does not really allow it even if you set other limits, they stay the same, hence my question. Did you notice the card drawing more or close to what you set?
> 
> And regarding the max drawn via the 2x8 pin power cables, they can draw quit a lot more if they are not utter crap quality and if the PSU can deliver, but yeah, the VRMs could get in trouble with a lot more power in
> ...


I did some measurements with HWiNFO at the power supply level some time ago on the 6800XT.
I have a Corsair HX1200i supply that has built-in sensors I can read through HWiNFO, and I do see the higher power consumption.
I have evaluated these before with Excel to see the power increase on the GPU with overclocking, as I wrote here:








The RX 6000 series Owners' Club – www.techpowerup.com
				




I have not done that recently, since I have played a lot with settings in the last days, but I have some log files recorded.
If I get the chance this weekend I can do some analysis and give feedback here.
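That estimation approach boils down to a subtraction, sketched below. The 10W assumed idle draw of the GPU itself is my own placeholder, not a measured number:

```python
# Estimate GPU load power from two whole-system PSU readings
# (e.g. logged from the HX1200i's sensors via HWiNFO).

def estimate_gpu_power(system_load_w, system_idle_w, gpu_idle_w=10):
    """GPU power under load ~= (system draw during a GPU stress test)
    - (system draw at idle), plus whatever the idle GPU was already
    drawing inside the baseline reading."""
    return (system_load_w - system_idle_w) + gpu_idle_w

# e.g. 430W under a GPU stress test, 140W at idle:
print(estimate_gpu_power(430, 140))  # -> 300
```

This obviously folds CPU and VRM efficiency changes under load into the "GPU" figure, which is why it is only an approximation.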

The BIOS modding may come at some point. At least there is now a tool that allows us to flash the BIOS on the 6800XT. 
I have it at the moment, but I am no expert in modding BIOSes.
I did that a long time ago for mainboards, but have not done it in ages. 

This tool also allows saving the full 1024kB BIOS. 
GPU-Z can only save a partial BIOS, which is half the size at 512kB. 
If somebody got the idea of flashing one of the BIOSes from the TPU database, they could brick their card, as the BIOS file size is wrong.
That is why I would not like to share the tool here. 
I found it in another forum.


----------



## Felix123BU (Feb 19, 2021)

turbogear said:


> I did some measurements with HWINFO on the power supply level some time ago on 6800XT.
> I have Corsair HX1200i supply that has built in sensors that I can read through the HWINFO and I see higher power consumption.
> I have evaluated these before with excel  to see the power increase on the GPU with overclocking like I wrote here:
> 
> ...


I BIOS modded every card I could, but since this gen is sort of locked regarding BIOS writing, I am a bit uneasy about the idea of playing the odds. And it's not only the fact that it's not a cheap card; even if I killed it for good, which is unlikely, I could not get a replacement. The market is horrible for GPU buyers right now...
From experience, bricking the card via bad or wrong BIOSes is not such a big deal; if you have minimal knowledge on the subject you can un-brick it rather easily. It happened to me many times, and not one card was permanently dead.
But if AMD has a change of heart, which I highly doubt, and the mod community starts playing around with it, count me in, hell yeah!


----------



## Caring1 (Feb 20, 2021)

Felix123BU said:


> OK, so the 6800 XT, and whole system is under water


I find the lack of fans disturbing, especially on the top radiator which should have the fans blowing through the radiator and out of the case.


----------



## Felix123BU (Feb 20, 2021)

Caring1 said:


> I find the lack of fans disturbing, especially on the top radiator which should have the fans blowing through the radiator and out of the case.


They are only missing in the picture; they are there, 2 fans on each radiator, it's just that they are "above" the rads. 
And no, I want them blowing air from outside the case inside, then exhausting it via the case fans. The case temperature increase is basically irrelevant for me, speaking of 5-10°C max extra inside, but I can run the rad fans quieter this way, with less need to ramp up.
No way in hell would I run a radiator without a fan.


----------



## GamerGuy (Feb 26, 2021)

It seems that although my RX 6900 XT is quite capable of playing COD MW Warzone at 3840x1080, very high to max in-game settings with RT enabled, I just got a freeze (reported via AMD Bug Report). Anyone else having this issue? The RT crash with Metro Exodus seems to have more or less been resolved; RT with COD MW WZ may be buggy, though I'd need to see if any others have the issue. I've disabled RT and the game seems to be fine.


----------



## BENNiGaming (Mar 6, 2021)

Hi, everyone, I join the family!
This is my XFX 319 modded in white, hope you like it!

I couldn't stand that terrible MERC sign, it reminded me of that overused Mistral font in the 90s, so... 

Next step: install a RGB strip behind the Radeon logo bracket, because its "white" is a lilac purple, conflicting with almost every RGB theme I make... 

I didn't OC yet; without doing anything, TimeSpy is at 16352. I'll let you know the results!


----------



## turbogear (Mar 6, 2021)

@BENNiGaming
Welcome to the club; that looks really nice.

My 6800XT reference card with the waterblock is currently lying in the closet. 
It will go on eBay to be sold in the near future.
It was one of the great performers among 6800XT cards, but I was looking for even more performance with a 3x8-pin version.

I have bought a PowerColor 6900XT Red Devil. I am working on tuning that one now to get the maximum performance out of it. 
A water block for the Devil is ordered but will come at the end of March.


----------



## 68Olds (Mar 6, 2021)

BENNiGaming said:


> Hi, everyone, I join the family!
> This is my XFX 319 modded in white, hope you like it!
> 
> I couldn't stand that terrible MERC sign, it reminded me of that overused Mistral font in the 90s, so...
> ...


Congrats on the new card.  That's a really clean build and a nice mod with the white.


----------



## Felix123BU (Mar 6, 2021)

BENNiGaming said:


> Hi, everyone, I join the family!
> This is my XFX 319 modded in white, hope you like it!
> 
> I couldn't stand that terrible MERC sign, it reminded me of that overused Mistral font in the 90s, so...
> ...


"reminded me of that overused Mistral font in the 90s" that's what a graphic designer would say


----------



## turbogear (Mar 7, 2021)

So I am experimenting with my new PowerColor Red Devil 6900XT. 

2675MHz@1065mV is a stable setting with MPT set to 330W and 384A TDC, additionally with the 15% power limit in Radeon Software. 
VRAM is stable at 2120MHz with fast timings.

Time Spy Score:








I scored 18 325 in Time Spy – AMD Ryzen 7 5800X, AMD Radeon RX 6900 XT, 16384 MB, 64-bit Windows 10 (www.3dmark.com)
				




Port Royal Score:








I scored 10 902 in Port Royal – AMD Ryzen 7 5800X, AMD Radeon RX 6900 XT, 16384 MB, 64-bit Windows 10 (www.3dmark.com)
				




I am not able to go higher than 2675MHz without a crash. 
I can set 2700MHz@1070mV, where Time Spy is stable but Port Royal is not.








I scored 18 426 in Time Spy – AMD Ryzen 7 5800X, AMD Radeon RX 6900 XT, 16384 MB, 64-bit Windows 10 (www.3dmark.com)
				




I tried higher and lower voltages, but it seems 2675MHz is the maximum I can achieve.

An Alphacool waterblock is ordered, but delivery is at the start of April.
I believe it is possible to get even higher performance once watercooled.
Link if anybody else is interested:








Alphacool Eisblock Aurora Acryl GPX-A Radeon RX 6800(XT)/6900XT Red Devil with Backplate – www.aquatuning.de
				




Setting in Radeon and MPT:


----------



## Felix123BU (Mar 7, 2021)

turbogear said:


> So I am expermenting with my new PowerColor Red Devil 6900XT.
> 
> 2675MHz@1065mV is stable setting with MPT set to 330W and 384A TDC additionally with 15% power limit in Radeon Software.
> VRAM is set stable at 2120MHz with fast timing.
> ...


Seems you get basically almost the same performance as the good-clocking 6800XT from before. The 6900XT might need moar power. The difference between them should be around 10%, and your highly OC'ed 6800XT was already above a stock 6900XT. Same argument as the one between the 3080 and 3090, except that one had a lot more VRAM in the discussion. The 6900XT should not really have existed, or at least not at the 1k price point.    Greedy AMD


----------



## Hardcore Games (Mar 7, 2021)

I have to ask: who paid more than MSRP for their RX 6000 series cards?


----------



## Felix123BU (Mar 7, 2021)

Hardcore Games said:


> I have to ask: who paid more than MSRP for their RX 6000 series cards?


I was in a way lucky; I paid what would normally be the MSRP in my country for the 6800XT, but that was launch day. I would not pay today's prices under any circumstances.

I initially said, after 5 gens of AMD GPUs, I'd get an Nvidia this gen. The 3080 came out, and I could have bought it 5 times for a premium ranging from EUR 300 to 500 extra; I said no each time. Then the 6800XT launch came at MSRP, so I got it.

I will not now or ever buy at scalper prices.


----------



## GerKNG (Mar 7, 2021)

My experience so far with a 6900XT:
broken pre-alpha stuff as usual. Glad I'm getting my money back.


----------



## Felix123BU (Mar 7, 2021)

GerKNG said:


> my experience so far with a 6900XT.
> broken pre alpha stuff as usual. glad i get my money back.


That's odd; for me this is by far the most stable card and set of drivers I have ever had.


----------



## turbogear (Mar 8, 2021)

Felix123BU said:


> Seems you get basically almost the same performance as the good-clocking 6800XT from before. The 6900XT might need moar power. The difference between them should be around 10%, and your highly OC'ed 6800XT was already above a stock 6900XT. Same argument as the one between the 3080 and 3090, except that one had a lot more VRAM in the discussion. The 6900XT should not really have existed, or at least not at the 1k price point.   Greedy AMD


Indeed, disappointing performance.
I expected much better, and less power-hungry.
You need to push it by about 65W to get 5% higher performance than my super-efficient 6800XT. 

I am returning this card and keeping my 6800XT. 

The 6900XT experiment failed for me.


----------



## GamerGuy (Mar 8, 2021)

I'm happy with my Nitro+ RX 6900 XT; not a great OC'er, but it does what I need it to do: handle games at max settings (no RT of course) with great framerates. I can't tell you guys how much fun it is to play all my games at max settings @ 3840x1080 and still net >100fps in a whole bunch of 'em. 

I've even tried Metro Exodus with RT at High and still net quite a playable framerate, though to be honest, Metro Exodus doesn't need a high framerate for smooth gameplay. Now I'm awaiting AMD's release of FidelityFX Super Resolution to see how much it helps in games with RT.


----------



## Divide Overflow (Mar 8, 2021)

I picked up a XFX 6800 XT Speedster Merc 319 Black.
Running like a champ so far.


----------



## Hardcore Games (Mar 8, 2021)

I guess my RTX 2080 will be holding on pending a new card, pending, pending yo wuz up


----------



## Space Lynx (Mar 8, 2021)

GamerGuy said:


> I'm happy with my Nitro+ RX 6900 XT, not a great OC'er, but it does what I need it to do.....handle games at max setting (no RT of course) with great framerates. Can't tell you guys how much fun it is to play all my games at max setting @ 3840x1080, and still net >100fps in a whole bunch of 'em.
> 
> I've even tried Metro Exodus with RT at High and still net quite playable framerate, but to be honest, Metro Exodus doesn't need scoring high framerate for smooth gameplay. Now, I'm awaiting AMD to release their FidelityFX Super Resolution and see how much it's gonna help with games with RT.




how the heck did you get your hands on that lovely card? just lucky like I was? I'm still a bit bummed I didn't get a 6800 XT, but thankful I got anything at all. I love my card, it's been rock solid stable... I get around 2475MHz in all demanding games, and my hotspot only reaches 80°C, and that's rare and doesn't last long; the actual GPU temp reading barely ever breaks 60-62°C... keep in mind I use an aggressive fan curve.


----------



## turbogear (Mar 8, 2021)

GerKNG said:


> my experience so far with a 6900XT.
> broken pre alpha stuff as usual. glad i get my money back.


Sorry to hear that.  
I guess you were just unlucky.
My 6800XT has been rock stable with a decent OC for two months now.
I can't say much about the 6900XT Red Devil; I only had it for 3 days.
I am now returning it, not because of any stability issues, but because I was disappointed by its performance gains over the 6800XT.

I have a really good reference 6800XT but thought a 3x8-pin, higher-binned Red Devil 6900XT could give me another 8-10% performance. In the end I could only get 5% extra over the 6800XT with 65W more power consumption.

The plan was to sell the 6800XT with its high OC potential on eBay for a very good price, but I guess I'll keep it and return the 6900XT instead.


----------



## Gmr_Chick (Mar 8, 2021)

GamerGuy said:


> C'mon, guys, surely there're more than two RX 6000 series owners here.....



Well, gee, I'd *love* to have a 6000 (or even a 5000) series card, but I'm rather fond of my lungs, heart and kidneys, and I'm poor, so I guess I'm stuck with my 1660 Super


----------



## Space Lynx (Mar 8, 2021)

Gmr_Chick said:


> Well, gee, I'd *love* to have a 6000 (or even a 5000) series card, but I'm rather fond of my lungs, heart and kidneys, and poor, so I guess I'm stuck with my 1660 Super




if you try to get a 6700 XT on launch day direct from the amd.com store for 479 bucks, you can probably sell that 1660 Super for more than what you paid for it... I'd wait until the 6700 XT is in your hands though, and don't pay more than MSRP for it. Looks like the 1660 Super is selling for 350 bucks on eBay, so yeah, it would be a very nice upgrade... the problem is whether you can get one on March 18th. The trick is: don't click insanely fast. I just clicked every 4 seconds or so. For some reason it worked and I got my 6800 non-XT on launch day. It's worth a try, and if you succeed, sell your 1660 Super to cover the cost.


----------



## Gmr_Chick (Mar 8, 2021)

In my situation, I'd have to sell the 1660 Super first, and then hope I could get a 6700XT on launch day...


----------



## Space Lynx (Mar 8, 2021)

Gmr_Chick said:


> In my situation, I'd have to sell the 1660 Super first, and then hope I could get a 6700XT on launch day...



ah ok, yeah I wouldn't do that. cause most likely bots will get them all on March 18th.


----------



## turbogear (Mar 8, 2021)

Gmr_Chick said:


> Well, gee, I'd *love* to have a 6000 (or even a 5000) series card, but I'm rather fond of my lungs, heart and kidneys, and poor, so I guess I'm stuck with my 1660 Super


Well, when I bought my 6800XT back in November, I paid 999€ for it, but I sold my Radeon VII on eBay for the crazy price of 700€.

When I bought the 6900XT to replace the 6800XT, the idea was to sell the 6800XT on eBay; looking at market prices, I was sure I could get 1200€ for it with the fitted EK water block and it being an especially high-OC card.

So the 6900XT would have cost me around 350€ more with an additional water block, but now I will not keep the 6900XT and will return it, as its performance is only 5% higher than the 6800XT's with 65W more power consumption.


----------



## Space Lynx (Mar 8, 2021)

turbogear said:


> Well, when I bought my 6800XT back in November, I paid 999€ for it, but I sold my Radeon VII on eBay for the crazy price of 700€.
> 
> Now when I bought 6900XT to replace 6800XT, the idea was to sell 6800XT on eBay  and looking at market price I was sure I can get 1200€ for it with fitted EK watercooler and being a special high OC card.
> 
> So 6900XT would had cost me around 350€ more with additional waterblock but now I will not keeping 6900XT and return it as performance is only 5% more than 6800XT with 65W more power consumption.




The 6800 XT is indeed the sweet spot, as is a highly OC'd 6800 non-XT like mine... as I said, I get around 2475MHz in all demanding games and it seems pretty steady there with great temps, around 180 watts draw. Not too shabby itself. 

I come within striking distance of an RTX 3090 in some AMD-favored games... and tie or beat a 2080 Ti across the board... all on rock solid drivers. If you had told me even 6 months ago AMD was capable of this, I would have laughed. I love Ryzen, but the botched 5700 XT launch/driver issues left me very wary; in fact I had no intention of getting this card, but I was unable to get an RTX 3080 so I just went for it. Luckily it all worked out in the end


----------



## turbogear (Mar 8, 2021)

Gmr_Chick said:


> In my situation, I'd have to sell the 1660 Super first, and then hope I could get a 6700XT on launch day...


I tried it within 10 minutes of release when the 6800XT launched.
I had it in my cart, and while I was paying with PayPal it sold out; I was put on a waiting list, which I cancelled after some time as the chance of getting a reference card was very low.
The bots were much faster.  

I don't regret cancelling the pre-order. I think the reference 6800XT never came back in stock at Mindfactory after that; the reference 6900XT was in stock on their website a number of times, but I never saw the reference 6800XT again.

I later bought my reference 6800XT from a private seller on eBay who was selling it because he had gotten an RTX 3090.


----------



## Gmr_Chick (Mar 8, 2021)

Well, at this point, I'd take a Sapphire Nitro 5700XT and be perfectly happy with it. Shit, I might even be able to run Battlefront II in DX12 properly with it. I tried with my 1660 Super and...yeah, it failed badly, lol.


----------



## Felix123BU (Mar 8, 2021)

lynx29 said:


> 6800 XT is indeed the sweet spot, and a highly oc'd 6800 non-xt like mine... as I said I get around 2475 mhz in all demanding games and it seems pretty steady there with great temps, around 180 watts draw. not to shabby itself.
> 
> I come within striking distance of a rtx 3090 in some AMD favored games... and tie or beat a 2080 ti across the board... all rock solid drivers. If you had told me even 6 months ago AMD was capable of this, I would have laughed, I love Ryzen, but the botched 5700 xt launch/driver issues left me very wary, in fact I had no intention of getting this card, but I was unable to get a rtx 3080 so I just went for it.  luckily it all worked out in the end


Yup, and all the 6800XTs I read about seem to easily achieve 2600MHz OCs at minimum, basically making them the best buy of the 6000 series, of course if they were at or close to MSRP.
I was also pleasantly surprised by how solid and stable the drivers are for a freshly launched GPU.


----------



## kapone32 (Mar 8, 2021)

Well, this was an interesting weekend, and I learned much about supply for the 6000 series GPUs in Canada. I had my pick of 3 different 6800XTs over the weekend. It is probably the same for people who have Micro Center brick and mortar stores nearby. It is almost impossible to buy a GPU online. I was on Kijiji on Friday and did a search for 6800XT. On Friday there were 6 listings; 5 of them were the Gigabyte Gaming OC, and 1 was actually the Asus LC 6900XT. On Saturday I went on Kijiji, and as the day went on the listings grew to 16. I sent out a few messages for the cards that were not stupidly overpriced. In one instance the seller told me he would meet me in the parking lot of the store (as they had confirmed his order was ready) for $1450 (CAD). The next one was $1475 (CAD) with the receipt, the card never having been opened. The one I settled on was $1440 (CAD) in an apartment 10 minutes up the road, paid by EMT and password.

In every instance the card was the Gigabyte Gaming OC 6800XT. In some cases the price on Kijiji is $140-175 more than the retail price of $1199 + tax (~$1300). However, over the weekend that exact card went up $50 at the same store, so I am actually paying only about a $60 premium for having the card in my hand with a receipt.

It is obvious that a supply of GPUs landed. It seems this was Gigabyte's, and the advantage of brick and mortar is that they don't suffer from the point-and-click ordering problem that exists online. I guess you have to have a friend or family member working at the store to get yours, though, as this particular store has said they do not take pre-orders.

So if you are bummed out about the prices on eBay and the ever out-of-stock Newegg, search Kijiji; you might be surprised at what you find.


----------



## sepheronx (Mar 8, 2021)

kapone32 said:


> Well this was an interesting weekend and I learned much about supply for the 6000 series GPUs in Canada. I have been able to buy 3 different 6800XTs over the weekend. It is probably the same thing for people that have Micro center brick and mortar stores too. It is almost impossible to buy a GPU online. I was on kijji on Friday and did a search for 6800XT. On friday there were 6 listings, 5 of them were the Gigabyte Gaming OC, and 1 was actually the Asus LC 6900XT. On Saturday I went on Kijji and as the day went on the listing grew to 16. I sent out a few messages for the cards that were not stupidly overpriced. In one instance the seller told me that he would meet me in the parking lot  of the store (as they had confirmed his order was ready) for $1450 (CAD). The next one was $1475 (CAD) with the receipt and the card never having been opened. The one I settled on was $1440 (CAD) in an apartment 10 minutes up the road by EMT and password.
> 
> In every instance the card was the Gigabyte Gaming OC 6800XT. In some cases the price on Kijji is $140-175 more than the retail price of 1199 + tax (1300). However on the weekend that exact card went up $50 at the same store so I am actually paying about $60 as a premium by having the card in my hand with a receipt.
> 
> ...



Memory Express, my dude. If there is one in your area, that's the place: about two Saturdays ago they got various 3070s and RX 6800 XTs in, all Gigabyte. That's where I picked up two 3070s.


----------



## kapone32 (Mar 8, 2021)

sepheronx said:


> Memory Express, my dude. If there is one in your area, that's the place: about two Saturdays ago they got various 3070s and RX 6800 XTs in, all Gigabyte. That's where I picked up two 3070s.


I didn't want to say it, but it seems to be affecting Canada Computers too. The invoice I saw for one was dated Mar 4, so I guess Memory Express got their shipment before CC. I am not going to lie though, I am quite happy to actually be able to get a card.


----------



## ChristTheGreat (Mar 8, 2021)

Gigabyte RX 6800 XT Gaming OC owner here. Not disappointed. The card is cool and powerful.


----------



## turbogear (Mar 8, 2021)

ChristTheGreat said:


> Gigabyte RX 6800 XT Gaming OC owner here. Not disappointed. The card is cool and powerful.


Welcome to the club. 
I have to agree that the 6800XT is very powerful and runs cool.
Good that I did not sell mine. 

My 6900XT Red Devil was a disappointment. It was like a monster you could feed near-infinite power just to get it 4-5% faster than the 6800XT.

With 380W on the core and 384A TDC it gave a Time Spy score of 20,600, whereas the 6800XT with 316W and 310A TDC achieves 19,800 in Time Spy.
That is only 4% faster with 64W more power draw.
It is on its way back to the online shop.
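To put numbers on that trade-off, here is a quick Python sketch of the comparison (the scores and power figures are the ones from my runs above; the points-per-watt framing is just my own way of looking at it):

```python
# Compare two tuning results: % performance gain vs. extra power drawn.
# Scores are 3DMark Time Spy graphics scores; watts are the core power
# targets set in MorePowerTool.

def compare(score_a, watts_a, score_b, watts_b):
    gain_pct = (score_b / score_a - 1) * 100   # how much faster B is than A
    extra_w = watts_b - watts_a                # extra core power B needs
    eff_a = score_a / watts_a                  # points per watt, card A
    eff_b = score_b / watts_b                  # points per watt, card B
    return gain_pct, extra_w, eff_a, eff_b

# 6800XT @ 316W core vs. 6900XT Red Devil @ 380W core
gain, extra, eff_6800xt, eff_6900xt = compare(19800, 316, 20600, 380)
print(f"{gain:.1f}% faster for {extra}W more "
      f"({eff_6800xt:.1f} vs {eff_6900xt:.1f} pts/W)")
# -> 4.0% faster for 64W more (62.7 vs 54.2 pts/W)
```

So the pushed 6900XT actually scores fewer points per watt than the tuned 6800XT, which is why it went back.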


----------



## ChristTheGreat (Mar 9, 2021)

haha!

Well, I had a 3070, but there was the mess with drivers, NVENC, and that 8GB of VRAM.

That card now runs in the second PC; the 6800XT is now in my main rig and it's just crazy. I only play at 1080p, but I can cap my framerate at 120 and keep the GPU quite cool. It will last long HAHAHA. I had an RX 5700 before and I liked it, like my RX 580. I haven't had any issue with AMD drivers in a long time. They're good!


----------



## GamerGuy (Mar 9, 2021)

ChristTheGreat said:


> haha!
> 
> Well I had a 3070, but a few mess with drivers, NVENC, and that 8gb of VRAM.
> 
> This card run in the second PC, 6800XT is now my main and it's just crazy. I do only play in 1080P, but I can fix my frame to 120,and keep the GPU quite cool. It will last long HAHAHA. I had a RX5700 before, and I liked it, like my RX580. I do not have any issue with AMD Driver since a long time. They're good!


I've been a user of both ATi/AMD and nVidia cards since the good old 9700 Pro and nVidia Riva TNT days, and have run both CF and SLI rigs through the years (the last being 2x RX VEGA64 and 2x GTX Titan 6GB respectively). Never really had any big issues with either; minor gaming performance issues perhaps, but more or less pretty stable.

That's why when peeps complain about 'crappy' AMD drivers and how awesome nVidia drivers are, I scratch my head in befuddlement. Both may have minor issues that can affect performance, but I've yet to come across any that would require me to roll back to an earlier 'stable' driver, since they're all pretty stable IIRC; it's only a question of performance in games.

BTW, both stoked and happy to see more and more getting into some RX 6000 series love.....in fact, it's looking more and more like an orgy of RX 6000 love!


----------



## Butanding1987 (Mar 9, 2021)

I got a Sapphire 6800 XT Nitro+ SE from Newegg a month ago at a price that was way higher than a reference 6900 XT because I'm from Asia (taxes and duties plus the fact that the concept of MSRP is totally alien to us).

I have since put it under water. This is my first custom loop. I think I watched a thousand YouTube videos to prepare myself for this project. LOL. Had I known I would be water cooling this, I would have gotten the non-SE version instead.

Anyway, here are some photos and benchmark results.


----------



## turbogear (Mar 9, 2021)

Butanding1987 said:


> I got a Sapphire 6800 XT Nitro+ SE from Newegg a month ago at a price that was way higher than a reference 6900 XT because I'm from Asia (taxes and duties plus the fact that the concept of MSRP is totally alien to us).
> 
> I have since put it under water. This is my first custom loop. I think I watched a thousand YouTube videos to prepare myself for this project. LOL. Had I known I would be water cooling this, I would have gotten the non-SE version instead.
> 
> Anyway, here are some photos and benchmark results.


For your first ever water loop, I have to say WOW, that looks great.
You could win a beauty contest with that. 

I have been doing water loops for many years, but mine never looked that great.


----------



## Deleted member 205776 (Mar 9, 2021)

Thread making me regret my 3070 and how crappy my PC looks.


----------



## ChristTheGreat (Mar 9, 2021)

GamerGuy said:


> I've been a user of both ATi/AMD and nVidia cards since the good old 9700 Pro and nVidia Riva TNT days, and have run both CF and SLi rigs through the years (last being 2x RX VEGA64 and 2x GTX Titan 6GB respectively). Never really had any big issue with either, minor gaming performance issue perhaps, but more or less pretty stable.
> 
> That's why when peeps complain about 'crappy' AMD drivers and how awesome nVidia drivers are, I'd scratch my head in befuddlement. Both may have minor issues that may affect performance, but I've yet to come across any that'd would require me to roll back to an earlier 'stable' driver since they're all pretty stable IIRC, only a question of performance in games.
> 
> BTW, both stoked and happy to see more and more getting into some RX 6000 series love.....in fact, it's looking more and more like an orgy of RX 6000 love!



Well, AMD is back, and it's powerful. I don't give a sh*t about ray tracing right now, and with Sony wanting more exclusives ported to PC, and the consoles using RDNA2, I just think it's a good thing for us.


----------



## Felix123BU (Mar 9, 2021)

ChristTheGreat said:


> Well, AMD is back, and it's powerful. I don't give a sh*t about ray tracing right now, and with Sony wanting more exclusives ported to PC, and the consoles using RDNA2, I just think it's a good thing for us.


That's almost always the case: at launch AMD GPUs have ~85% of their eventual performance, then over 1-2 years they recover the rest via driver improvements. It has almost always been so: the 580 vs the 1060 (look at the 580 today), the V64 vs the 1080 (look at the V64 today), the 5700XT vs the 2070S (look at the 5700XT today). And the new-gen consoles being RDNA2 should also help. Not saying Nvidia aren't good; they are almost always a bit better at launch in comparable tiers, but then things turn around. I tend to keep my GPU around 3 years, so for me either AMD or Nvidia is fine, with the bonus that AMD might be better in 1-2 years' time.


----------



## Butanding1987 (Mar 10, 2021)

turbogear said:


> For your first ever water loop I have to say WOW that looks so great.
> You can win beauty contest with that.
> 
> I have been on water loops since many years but mine never looked that great.


Thank you, YouTube is a great teacher.


----------



## nguyen (Mar 10, 2021)

ChristTheGreat said:


> Well, AMD is back, and it's powerful. I don't give a sh*t about ray tracing right now, and with Sony wanting more exclusives ported to PC, and the consoles using RDNA2, I just think it's a good thing for us.



Sony porting games to PC doesn't mean those games will favor RDNA2, though. FF XV, Monster Hunter: World, Death Stranding, and Nioh 2 support DLSS, and you will see more ported games supporting DLSS coming out. Well, maybe you don't give a crap about DLSS either.


----------



## turbogear (Mar 10, 2021)

turbogear said:


> For your first ever water loop I have to say WOW that looks so great.
> You can win beauty contest with that.
> 
> I have been on water loops since many years but mine never looked that great.


By the way, what settings are you using on your 6800XT for the OC?
Your Port Royal results are a little better, but your Time Spy score is a little lower than mine.  

My settings: 2675MHz @ 985mV, with MPT set to 275W and 310A TDC, plus the 15% power limit in Radeon Software. My VRAM though can only operate stably at 2100MHz with fast timings; the VRAM does not like higher frequencies on my sample.
Here are my results in detail:

The RX 6000 series Owners' Club – www.techpowerup.com
"New poster, and proud owner of a new 6800XT! Just picked up a PowerColor reference model last week. Had to pay £100 over MSRP to secure it, but well worth it in the end once I got it home, put it under water and started running some benchmarks. I bought myself an EK Quantum Vector water..."
I increased the voltage by 5mV over the settings posted in the link to get a higher Port Royal Stress Test result, from 99.3% to 99.7%.  

I got a Time Spy score in your range before, when I had the lower setting of 2650MHz @ 970mV posted in the link above.


----------



## ratirt (Mar 10, 2021)

nguyen said:


> Well maybe you don't give a crap about DLSS either


Why would he? He's got an RX 6800XT.


turbogear said:


> My settings, 2675MHz@985mV with MPT set to 275W, 310A TDC and additionally 15% power limit in Radeon Software. My VRAM though can only operate stable at 2100MHz with fast timing. The VRAM does not like higher frequencies on my sample.





turbogear said:


> I increased 5mV to the setting posted in the link to get higher score in Port Royal Stress Test from 99.3% to 99.7%.
> 
> I got Time Spy score in your range before when I had lower setting of 2650MHz@970mV as posted in the link above.


I see you went with some optimizations lately. Considering your voltage, I'm still amazed that it's set to 970mV; I know it will probably ramp up when needed, but that's still nice. I don't have time to play with my card at this point. Too much work. I'll try over the weekend when I get my new monitor. 
I've been thinking about getting a better PSU, since my current one seems a little short for trying things. Maybe I'll just undervolt and see where I can get with this, but it would seem I can't go below 1V no matter what.


----------



## turbogear (Mar 10, 2021)

ratirt said:


> Why would he? He's got RX 6800XT.
> 
> 
> I see you went with some optimizations lately. Considering your voltage. I'm still amazed that it's set to 970mv? i know it will probably ramp up when needed but that's still nice. I don't have time to play with my card at this point. Too much work. I'll try over the weekend when I get my new monitor.
> Been thinking about getting better PSU since my current one seems a little bit short to try things. Maybe I just go Undervolt and see where I can get with this but it would seem I can't go below 1V no matter what.


970mV also works with the 2650MHz setting, but I am currently using 985mV with the 2675MHz setting. This gives about 130 points more in Time Spy. 

By the way, I also had a 6900XT Red Devil to play with for some days last week. 
It has now been sent back, as the performance was not satisfying compared to my 6800XT, which overclocks really well. 

You will be disappointed that the 6900XT does not allow you to go that low with the voltage.
The lowest I could manage with the 6900XT Red Devil was 1065mV with the frequency set to 2675MHz.
I could not get the Red Devil to run stable with anything more aggressive. I had to really push the power high with MPT on the Red Devil to get only 4-5% higher performance than my 6800XT.
You can read about my experiment with the Red Devil 6900XT here:

The RX 6000 series Owners' Club – www.techpowerup.com
"OK, so the 6800 XT, and whole system is under water :D As expected, made some rookie water build mistakes, as in did not think of how the tubes would go, hence had to do some ghetto routing, but all good for a start. The gpu cooling is insane with this setup, even after 1 hour furmark with the..."

----------



## ratirt (Mar 10, 2021)

turbogear said:


> 970mV also works with 2650MHz setting, but I am using currently 985mV for 2675MHz setting. This gives about ~130 points more in Time Spy.
> 
> By the way I also had last week a 6900XT Red Devil to play with for some days.
> Now it has been sent back as the performance was not too satisfying for me compared to my 6800XT that overclocks really great.
> ...


Now it makes me wonder whether your 6900XT was lower quality or whether this applies to all 6900XT Red Devils. Considering that I have the exact same one, you sir have made me worried I won't be able to pull off a decent OC. If I change my PSU, that is. The 750W I have now makes me think it is definitely not enough for a long-term OC. I really hope you are wrong about the 6900XT Red Devil and it was just your card's misfortune in OC. 
The problem is, it is hard to get any decent PSU at the moment, and if I could get one, the prices are astronomical. The other thing is, my CPU can't keep up with the card while benching at 1080p. I would need some 4K benchmark that would fully utilize the GPU and give the CPU a break; otherwise the score will be total hokum, not anything representative.


----------



## turbogear (Mar 10, 2021)

ratirt said:


> Now it makes me think if your 6900XT was lower quality or if this situation applies to all 6900XT Red Devils. Considering that I have exact one you sir made me worried I won't be able to pull of some decent OC. If I change my PSU that is. The 750W I got now makes me think it is definitely not enough for a long term OC. I really hope you are wrong with the 6900xt Red Devil and it was your card's misfortune in OC only.
> problem is, it is hard to get any decent PSu at the moment and if I could get one the prices are astronomical. The other one is, my CPU can't keep up with the card while benching at 1080p. I would need to get some 4k bench that would utilize GPU fully and give the CPU a heave ho. Otherwise the score will be a total hokum not anything representative.



Well, my Red Devil was not the best.
There are some out there that are better than mine, but still, as far as I have read, the undervolt range is not as wide as on the 6800XT.
Some users of the Red Devil 6900XT have a thread going at Igor's Lab. It is a German thread, though.
Here is the link to that thread:

AMD – PowerColor Radeon RX 6900 XT Red Devil – www.igorslab.de
"Hello everyone! Having now treated myself to an RX 6900 XT, and watching Igor's videos regularly, I have also tried to 'beat up' my graphics card! As an absolute OC beginner, I carefully tried to work my way forward the way I think I understood it from the videos..."
A 1040-1060mV voltage range is possible with the Red Devil, but as you will also see there, you need to increase the power target a lot to get good performance.
You will see people there using in the range of 400W on the GPU core alone (total GPU power of ~450W) and 384A TDC to achieve Time Spy scores around 20,800.
My sample was not able to go much higher than 20,600 in Time Spy, even though I tried 400W and 384A TDC.

Because my 6800XT can achieve up to a 20K Time Spy score, for me it did not make sense to replace the 6800XT with the 6900XT for only a 4-5% maximum performance gain, because the power consumption needed to achieve it is much higher.

You will need a good PSU if you want to push the Red Devil; as said, the power target needs to be increased from the 281W default to around 400W on the core to achieve high performance.
I have a 1200W PSU, so for me that was not a problem.
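For anyone sizing a PSU for this kind of tuning, a rough back-of-the-envelope check is: tuned GPU core target, plus board/VRAM overhead, plus CPU and the rest of the system, then leave headroom for transient spikes. A small Python sketch (the CPU and rest-of-system wattages below are illustrative guesses, not measurements):

```python
# Rough PSU sizing check for a heavily tuned GPU. Back-of-the-envelope
# only: the CPU and rest-of-system figures are illustrative assumptions.

def psu_check(gpu_core_w, gpu_overhead_w, cpu_w, rest_w, psu_w, margin=0.30):
    """Return estimated sustained load and whether the PSU still keeps
    `margin` of its capacity free for transient power spikes."""
    load = gpu_core_w + gpu_overhead_w + cpu_w + rest_w
    usable = psu_w * (1 - margin)  # capacity left after reserving headroom
    return load, load <= usable

# Red Devil pushed to 400W core (~450W total GPU), plus ~150W CPU and
# ~50W for fans/drives/board.
load, ok = psu_check(400, 50, 150, 50, psu_w=750)
print(load, ok)    # a 750W unit leaves no transient headroom at this load

load, ok = psu_check(400, 50, 150, 50, psu_w=1200)
print(load, ok)    # a 1200W unit is comfortable
```

By this estimate a 750W unit is marginal for a 400W core target, which matches the advice above about wanting a bigger PSU before pushing the card.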


----------



## ratirt (Mar 10, 2021)

turbogear said:


> 1040-1060mV voltage range is possible with Red Devil, but as you will also see there you need to increase the power target a lot to get good performance.
> You will see there people use in range of 400W  on the GPU Core only (Total GPU power of ~450W) and 384A TDC  to achieve Time Spy score in range of 20800.
> My sample was not able to go much higher than 20600 on Time Spy score though I tried 400W and 384A TDC.
> 
> ...


I had mine set to 1000mV and it was working as intended, but with no OC, or maybe a slight OC; I'm not quite sure now, since I haven't played with it for a while. 
400W is a lot, and a killer for my PSU. I would need to look for a better PSU; until then, I'll give the OC a pass and focus on undervolting. 
I know the 6800XT is a great overclocker and the 6900XT's price/performance sucks, but I got this one, so I'm stuck with it. It's not a bad card anyway, and there's still headroom for OC. Maybe small, but it's there.


----------



## turbogear (Mar 10, 2021)

ratirt said:


> I got mine set to 1000mv and it was working as intended but no OC or maybe slight OC I'm not quite sure now since I haven't played with it for a while now.
> 400W that's a lot an a killer to my PSU. I would need to look for a better PSU until then, I give the OC no go and focus on undervolting.
> I know the 6800XT is a great overclocker and 6900XT's price and performance sucks but I got this one so I'm stuck with it. It's not a bad card anyway and there's still headroom for OC. Maybe small but still is.


If you can undervolt it to 1000mV, that is a good indication that you hopefully have one of those really great cards. 
Not many 6900XTs are stable that low. Many cannot even go lower than 1100mV; mine was unstable below 1050mV.

The lower the voltage you can set, the less power it will need to have reasonable OC headroom.
You can try the auto overclock in Radeon Settings.
This will automatically set a frequency and gives a good indication of how high you can overclock.

With my Red Devil, auto overclock set the frequency to 2580MHz, and I was able to achieve 2675MHz.
If Radeon gives you a similar value, that would indicate you could most probably go into the 2700MHz range or even higher.
The 1000mV will help you get the high performance without going crazy on power consumption.


----------



## ChristTheGreat (Mar 10, 2021)

nguyen said:


> Sony porting games to PC doesn't mean those games will favor RDNA2 though. FF XV, Monster Hunter: World, Death Stranding, Nioh 2 are supporting DLSS and you will see more ported games supporting DLSS coming out. Well maybe you don't give a crap about DLSS either




Well, maybe; we don't know. If devs port games correctly, it could take advantage.

DLSS, no, I don't care. I play at 1080p with high frames and won't change for a long time.

AMD has FidelityFX, which also works in a couple of games.

Anyway, I thought my return to nVidia would be nice, and I was quite disappointed, so I don't care about any nVidia tech right now.


----------



## Butanding1987 (Mar 10, 2021)

turbogear said:


> By the way what settings are you using on you 6800XT for the OC?
> Your Portal Royal results are a little better but Time Spy  a little less than mine.
> 
> My settings, 2675MHz@985mV with MPT set to 275W, 310A TDC and additionally 15% power limit in Radeon Software. My VRAM though can only operate stable at 2100MHz with fast timing. The VRAM does not like higher frequencies on my sample.
> ...


I don't remember the settings for that Port Royal result; I've been unable to reproduce it. The most I can get now is 10,156 to 10,178 using the settings below. I usually get higher scores right after booting in the morning. I also find that I can usually get away with higher GPU and memory clocks by undervolting instead of increasing the voltage.

As a side note, my GPU and junction temperatures reached as high as 90°C and 113°C, respectively, when running Port Royal using the stock fans. With a custom loop, the maximum temperatures are now in the 50s and 60s, respectively. Maybe 72°C tops when looping.


----------



## Athlonite (Mar 10, 2021)

Gmr_Chick said:


> Well, at this point, I'd take a Sapphire Nitro 5700XT and be perfectly happy with it. Shit, I might even be able to run Battlefront II in DX12 properly with it. I tried with my 1660 Super and...yeah, it failed badly, lol.



I'd offer to buy you an RX 6800 non-XT here, but shipping to you would kill any savings over just buying one where you live.
PowerColor Red Dragon Radeon RX 6800 16GB graphics card: $1,224.66 (~$858.24 USD) + $???.?? shipping, and don't forget duty.


----------



## kapone32 (Mar 10, 2021)

nguyen said:


> Sony porting games to PC doesn't mean those games will favor RDNA2 though. FF XV, Monster Hunter: World, Death Stranding, Nioh 2 are supporting DLSS and you will see more ported games supporting DLSS coming out. Well maybe you don't give a crap about DLSS either


Yeah, because the PS has been using Intel parts for years. DLSS, really. My TV provider uses a form of DLSS-like AI upscaling to take their 540p output to 1080p on your TV. A game changer indeed.


----------



## Felix123BU (Mar 10, 2021)

I am still of the opinion that DLSS is a pure gimmick and marketing feature. I just cannot see Nvidia making DLSS what it should be, i.e. available in almost all games; that would make it something other than a gimmick.

Why would they do that? So that a 3050 can offer the same experience as a 3080? So you would not need to buy a 3080 if a supposed 3050 can deliver the same experience? In what universe would Nvidia want that? 

One thing I will always praise, though, is Nvidia's marketing. It's sooo good: they managed to make DLSS look like the holy grail, and everyone is buying into it, while almost everyone forgets it was done to hide the crappy ray tracing performance, not because Nvidia is your friend. And AMD is playing the game as well, as usual late to the party, and with horrendous naming. FidelityFX Super Resolution, come on, really, nothing better came up for this?


----------



## nguyen (Mar 10, 2021)

kapone32 said:


> Yeah, because the PS has been using Intel parts for years. DLSS, really. My TV provider uses a form of DLSS-like AI upscaling to turn their 540p output into 1080p on your TV. A game changer indeed.



Well, if you think AI-upscaling a video game, where user inputs create different outcomes, is the same as upscaling a movie, then yeah


----------



## Butanding1987 (Mar 10, 2021)

Felix123BU said:


> I am still of the opinion that DLSS is a pure gimmick and marketing feature. I just cannot see Nvidia making DLSS what it should be, namely available in almost all games; that is what would make it something more than a gimmick.
> 
> Why would they do that? So that a 3050 can have the same experience as a 3080? So you would not need to buy a 3080 if a supposed 3050 can match it? In what universe would Nvidia want that?
> 
> One thing I will always praise, though, is Nvidia's marketing; it's so good they managed to make DLSS look like the holy grail. Everyone is buying into it, and almost everyone forgets it was done to hide the poor ray tracing performance, not because Nvidia is your friend. And AMD is playing the same game, late to the party as usual, and with horrendous naming: FidelityFX Super Resolution. Come on, could they really not come up with anything better?


RedGamingTech over on YouTube claims Super Resolution will double the performance and is backward compatible with 5000 series cards.


----------



## kapone32 (Mar 10, 2021)

Felix123BU said:


> I am still of the opinion that DLSS is a pure gimmick and marketing feature. I just cannot see Nvidia making DLSS what it should be, namely available in almost all games; that is what would make it something more than a gimmick.
> 
> Why would they do that? So that a 3050 can have the same experience as a 3080? So you would not need to buy a 3080 if a supposed 3050 can match it? In what universe would Nvidia want that?
> 
> One thing I will always praise, though, is Nvidia's marketing; it's so good they managed to make DLSS look like the holy grail. Everyone is buying into it, and almost everyone forgets it was done to hide the poor ray tracing performance, not because Nvidia is your friend. And AMD is playing the same game, late to the party as usual, and with horrendous naming: FidelityFX Super Resolution. Come on, could they really not come up with anything better?


I know exactly what you mean. They should have called it FreeLSS



nguyen said:


> Well, if you think AI-upscaling a video game, where user inputs create different outcomes, is the same as upscaling a movie, then yeah


It doesn't matter; I have no intention of turning down my settings anyway. But since this is a 6800 XT owners' thread, why are you, a 3090 owner, creeping on our garden?


----------



## turbogear (Mar 10, 2021)

Butanding1987 said:


> I don't remember the settings for that Port Royal result. I've been unable to reproduce it. The most I can get now is 10,156 to 10,178 using the settings below. I usually get higher scores upon booting in the morning. I also find that I can usually get away with a higher GPU and memory clock by undervolting instead of increasing the voltage.
> 
> As a side note, my GPU and junction temperatures reached as high as 90 C and 113 C, respectively when running Port Royal using the built-in fans. With a custom loop, the maximum temperatures are now in the 50s and 60s, respectively. Maybe 72 C tops when looping.


Your settings are very similar to mine; the lower performance can be explained by the fact that you need 1010mV while mine works as low as 985mV.
As you have a Sapphire Nitro+, the power target is already set to 289W by default, so you did not need MPT, but you could consider increasing the TDC a little to get more performance.

Therefore you only need to undervolt your GPU to get higher performance.

I have the reference design with a 255W default power target, and I additionally use MPT to set the GPU core power target to 275W and the TDC to 310A, on top of the settings in Radeon Software.

According to the BIOS info page on TPU, the Nitro+ has a 289W core power target and a 300A TDC. I would try increasing the TDC a little, for example to 310A; it could give you a little more performance.
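Power target, voltage, and current limit are tied together by P = V x I, so a quick division shows whether a core power target actually fits under a TDC limit. A minimal sketch using the figures from this post (the function name is mine; real load voltage will of course wander around the Wattman setting):

```python
# Rough sanity check: does a given core power target fit within a TDC limit?
def min_voltage_for_tdc(power_w: float, tdc_a: float) -> float:
    """Lowest sustained core voltage at which power_w stays under tdc_a."""
    return power_w / tdc_a

# Reference card raised via MPT: 275 W core with 310 A TDC
print(f"275 W fits in 310 A as long as Vcore stays above "
      f"{min_voltage_for_tdc(275, 310):.3f} V")
# Nitro+ defaults: 289 W core, 300 A TDC
print(f"289 W / 300 A -> {min_voltage_for_tdc(289, 300):.3f} V floor")
```

So with an undervolt near 1.0V, both configurations sit comfortably inside their TDC limits.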








Sapphire RX 6800 XT VBIOS (16 GB GDDR6, 500 MHz GPU, 2000 MHz Memory) - www.techpowerup.com
				




You also mentioned that you get better performance right after you turn on the computer. That is a little strange with a water cooler. I had this before with the original cooler, as after a few runs temperatures were higher and performance then dropped, but I haven't noticed anything like that since I installed the water block.


----------



## Butanding1987 (Mar 10, 2021)

turbogear said:


> Your settings are very similar to mine; the lower performance can be explained by the fact that you need 1010mV while mine works as low as 985mV.
> As you have a Sapphire Nitro+, the power target is already set to 289W by default, so you did not need MPT, but you could consider increasing the TDC a little to get more performance.
> 
> Therefore you only need to undervolt your GPU to get higher performance.
> ...


Speaking of undervolting, I tried 985mV in Port Royal just now and it passed. I'm not sure how stable this is, though.


----------



## Felix123BU (Mar 10, 2021)

turbogear said:


> Your settings are very similar to mine; the lower performance can be explained by the fact that you need 1010mV while mine works as low as 985mV.
> As you have a Sapphire Nitro+, the power target is already set to 289W by default, so you did not need MPT, but you could consider increasing the TDC a little to get more performance.
> 
> Therefore you only need to undervolt your GPU to get higher performance.
> ...


I am not sure how much more power the 6800XT can take. Past cards were over-engineered and could safely take a lot more, but with this one I am not sure; the 15% max power slider does not give me great confidence in really maxing it out. I don't want to fry a "650" USD card, especially since there is no replacement to be had   

Do any of you know of a review of the PCB and power delivery system of the 6800XT? I have not found one that goes in depth on how much the card's power delivery can take.


----------



## turbogear (Mar 10, 2021)

Felix123BU said:


> I am not sure how much more power the 6800XT can take. Past cards were over-engineered and could safely take a lot more, but with this one I am not sure; the 15% max power slider does not give me great confidence in really maxing it out. I don't want to fry a "650" USD card, especially since there is no replacement to be had
> 
> Do any of you know of a review of the PCB and power delivery system of the 6800XT? I have not found one that goes in depth on how much the card's power delivery can take.


The only analysis of the 6800XT's power delivery stage I know of is Buildzoid's video about the 6800XT Red Devil. 
The Red Devil has, I think, three more phases for core power than the reference design, but the power stages used are the same.
The reference design has an 11-phase power supply for VCore and the Red Devil has 14 phases, if I remember correctly.

Each phase is capable of supplying 70A and uses an Infineon TDA21472 power stage. So in theory it is capable of 11x70A = 770A, but I think you would need LN2 to cool it then. 
The more we draw from the power stages, the higher their power dissipation (i.e. heat output) gets and the harder they are to cool, but with our water loops we have more headroom.
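A rough way to put those numbers in perspective is to split the total current across the phases and estimate per-stage conduction loss. This is a back-of-envelope sketch, not datasheet math: the 1.5 mOhm effective loss resistance is an assumption picked purely for illustration, and the phase count is the reference-card figure discussed in this thread (the posts cite 10 or 11; 11 is used here):

```python
# Back-of-envelope VRM load sketch for the figures above.
PHASES = 11          # reference-design VCore phase count discussed in the thread
STAGE_LIMIT_A = 70   # per-stage rating of the TDA21472 (from the post above)
R_EFF_OHM = 0.0015   # ASSUMED effective loss resistance per stage, illustrative only

def per_phase_current(total_a: float, phases: int = PHASES) -> float:
    """Total VCore current split evenly across the phases."""
    return total_a / phases

def stage_loss_w(i_a: float, r: float = R_EFF_OHM) -> float:
    """Simple I^2*R conduction-loss estimate, not datasheet-grade math."""
    return i_a ** 2 * r

i = per_phase_current(310)  # the 310 A TDC setting mentioned earlier
print(f"310 A over {PHASES} phases -> {i:.1f} A/phase "
      f"({i / STAGE_LIMIT_A:.0%} of the 70 A stage rating)")
print(f"~{stage_loss_w(i):.1f} W conduction loss per stage (assumed R)")
```

Even at a 310A TDC the stages sit around 40% of their rating, which matches the impression that heat removal, not the VRM itself, is the practical limit.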










If I get time in the near future I will try to calculate the power dissipation for a certain amount of load. 
As an engineer I have the technical background, but I need some time to read into these DC-DC converter (step-down buck converter) worst-case calculation topics.



Butanding1987 said:


> Speaking of undervolting, I tried 985mV in Port Royal just now and it passed. I'm not sure how stable this is, though.


Very nice.  
You can run the Port Royal stress test.
It is really nasty; if the GPU is not stable, you will start seeing artifacts after a few loops.


----------



## ratirt (Mar 10, 2021)

turbogear said:


> The only analysis of the 6800XT's power delivery stage I know of is Buildzoid's video about the 6800XT Red Devil.
> The Red Devil has, I think, three more phases for core power than the reference design, but the power stages used are the same.
> The reference design has an 11-phase power supply for VCore and the Red Devil has 14 phases, if I remember correctly.
> 
> ...


I found this Reddit thread about the 6900XT:
https://www.reddit.com/r/Amd/comments/koeudw
There aren't a lot of 6900XT reviews out there. 
The poster says 1100mV and a 2700MHz core clock. That's nothing spectacular. AMD locked all the cards to basically reach 2700MHz and that's about it.


----------



## Felix123BU (Mar 10, 2021)

turbogear said:


> The only analysis of the 6800XT's power delivery stage I know of is Buildzoid's video about the 6800XT Red Devil.
> The Red Devil has, I think, three more phases for core power than the reference design, but the power stages used are the same.
> The reference design has an 11-phase power supply for VCore and the Red Devil has 14 phases, if I remember correctly.
> 
> ...


Thanks, that is helpful. So according to that, theoretically the 6800XT should still be able to take double what it's set to  I don't really care that much about efficiency for testing purposes.
I found myself browsing the Buildzoid channel for a long time hoping to see him review a reference 6800XT PCB, or at least a reference 6900XT. No luck; I guess he hates the 6000 series because of its artificially imposed limits, which is a pity.


----------



## turbogear (Mar 10, 2021)

ratirt said:


> I found this
> 
> 
> 
> ...


I know this Reddit page; I also did a deep dive into it when I had a 6900XT to play with. 



Felix123BU said:


> Thanks, that is helpful. So according to that, theoretically the 6800XT should still be able to take double what it's set to  I don't really care that much about efficiency for testing purposes.
> I found myself browsing the Buildzoid channel for a long time hoping to see him review a reference 6800XT PCB, or at least a reference 6900XT. No luck; I guess he hates the 6000 series because of its artificially imposed limits, which is a pity.


Yes, Buildzoid hates that AMD limited the frequency and voltage on the 6000 series.


----------



## Felix123BU (Mar 10, 2021)

ratirt said:


> I found this
> 
> 
> 
> ...


"AMD locked all the card to basically reach 2700Mhz and that's about it."

That's also my impression, the 6800XT and 6900XT are basically 2600+ Mhz cards, and that is also the reason the stock frequencies are what they are, quite low. That's also the reason why some here did not see a big difference between the 6800XT and the 6900XT, because they both clock basically the same at the upper end. Had AMD clocked them to what they really are, 2600 Mhz cards, that would have basically negated the performance and price difference between them.  



turbogear said:


> I know this reddit page, I also had a deep dive onto it when I had 6900XT to play with.
> 
> 
> Yes Buildzoid hates AMD because they limited the frequency and voltage on 6000 series.


I don't hate AMD for that, I just find it a bit sad. AMD GPUs were traditionally very fun to tinker with because the limits were more relaxed. No more; you now get a locked card with kick-ass performance only


----------



## ratirt (Mar 10, 2021)

I wonder if at some point, there will be bios mods that would allow you to unlock the card fully. I would love to try 6900XT past the barriers AMD has put in with voltage, power etc. Not to mention the memory frequency lock.


----------



## turbogear (Mar 10, 2021)

ratirt said:


> I wonder if at some point, there will be bios mods that would allow you to unlock the card fully. I would love to try 6900XT past the barriers AMD has put in with voltage, power etc. Not to mention the memory frequency lock.


It is not only the frequency and power limits that prevent higher performance. On many 6900XTs and 6800XTs it is actually voltage.

My 6800XT and 6900XT could both only reach 2640MHz real frequency, so there was still headroom up to the 2700MHz limit. Both also had headroom for me to apply more power, but extra power and higher frequency settings did not help; the settings were unstable.
They would need higher voltage to clock higher.

Maybe there is some reason not to allow higher voltages?
I know some SoCs from my work where the power supply tolerance is limited to +/-5% on a VCore of 1V.

With the memory, there are actually a lot of cards out there that cannot even achieve 2150MHz. It depends on memory quality.
The Red Devil 6900XT that I tested was unstable above 2120MHz on memory. My 6800XT becomes unstable above 2100MHz.
So on both of these cards it would not help if AMD removed the 2150MHz limit. It would not work anyway.
Maybe they limited the memory because Samsung told them these memory chips should not go too high.


----------



## Felix123BU (Mar 10, 2021)

turbogear said:


> It is not only the frequency and power limits that prevent higher performance. On many 6900XTs and 6800XTs it is actually voltage.
> 
> My 6800XT and 6900XT could both only reach 2640MHz real frequency, so there was still headroom up to the 2700MHz limit. Both also had headroom for me to apply more power, but extra power and higher frequency settings did not help; the settings were unstable.
> They would need higher voltage to clock higher.
> ...


Those are some good points, namely the max voltage allowed and the memory speeds, which seem quite conservative. I still dream of a 6800XT or 6900XT with HBM on it; maybe AMD makes one some day. It would be fun to see


----------



## Sluppermand (Mar 10, 2021)

Finally I was able to pick up my computer from the repair shop, so now I'm part of this club as well.




UserBenchmarks: Game 177%, Desk 95%, Work 165%
It says I should ensure I'm running dual channel. I searched as much as I could, but I really can't find the setting in the BIOS. The RAM is running at 3600MHz CL14 as it's supposed to.
However, PassMark doesn't flag anything and the score is decent, so I believe it's just the UserBenchmark program that has a problem.

It's all stock. I won't be touching anything until I get my 5900X, which sadly, seems to be a few months away according to the latest intel.

With 100% load on the CPU, it reached 72°C (about 162°F). I'm running a 3600X with a be quiet! Dark Rock Pro 4 cooler. Is that normal? Mostly just wondering if I applied the correct amount of paste.
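For anyone double-checking the conversion, Fahrenheit is just F = C x 9/5 + 32; a one-liner confirms the figure:

```python
# Celsius to Fahrenheit conversion for the CPU temperature above.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

print(f"72°C = {c_to_f(72):.1f}°F")  # prints 72°C = 161.6°F
```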


----------



## Gmr_Chick (Mar 11, 2021)

Athlonite said:


> I'd offer to buy you an RX 6800 non-XT here, but shipping to you would kill any savings over just buying one where you live.
> PowerColor Red Dragon Radeon RX 6800 16GB graphics card: $1,224.66 ($858.24 USD) + $???.?? shipping, and don't forget duty.



Thank you for the offer regardless though


----------



## ratirt (Mar 11, 2021)

turbogear said:


> It is not only the frequency and power limits that prevent higher performance. On many 6900XTs and 6800XTs it is actually voltage.
> 
> My 6800XT and 6900XT could both only reach 2640MHz real frequency, so there was still headroom up to the 2700MHz limit. Both also had headroom for me to apply more power, but extra power and higher frequency settings did not help; the settings were unstable.
> They would need higher voltage to clock higher.
> ...


Maybe you are right, but on the other hand what you are saying is all speculation, looking for a reason why.
If there is a reason, it might be totally different from memory or SoC limitations. 
Sooner or later somebody will come up with an unlocked BIOS or a board mod, and hopefully we will then know how much more you can get from these cards. 
Maybe AMD locked them so that they won't exceed the power consumption too much and go overboard with it. That is plausible, since these cards do not gobble that much power; the efficiency would be gone otherwise.


----------



## turbogear (Mar 11, 2021)

ratirt said:


> Maybe you are right, but on the other hand what you are saying is all speculation, looking for a reason why.
> If there is a reason, it might be totally different from memory or SoC limitations.
> Sooner or later somebody will come up with an unlocked BIOS or a board mod, and hopefully we will then know how much more you can get from these cards.
> Maybe AMD locked them so that they won't exceed the power consumption too much and go overboard with it. That is plausible, since these cards do not gobble that much power; the efficiency would be gone otherwise.


On second thought about voltage: as the 6800XT uses the same 7nm process as the 6900XT, in theory it should be able to go as high on voltage as the 6900XT (1175mV vs 1150mV), so it would be interesting if somebody could unlock the 6800XT voltage slider to accept 6900XT-like settings.

On the VRAM, as I mentioned last time, I have read a lot of forums and see that many 6800XTs and 6900XTs out there cannot even reach 2150MHz, so I wonder about the quality of the Samsung memory chips.
Maybe a higher voltage setting would help the memory too.
I played with memory frequency settings again yesterday and can confirm that on my 6800XT, if I set the VRAM frequency higher than 2100MHz, the Time Spy score drops by about 200 points already at a 2120MHz setting.
At a VRAM setting of 2150MHz, Time Spy drops to the 19200 range from the 19800 range at 2100MHz.  
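Since scores can *drop* past a point even though the clock still "passes" (error-correction retries eat the gained bandwidth), a tiny helper makes the regression point obvious when logging runs. The scores below are just the example figures from this post:

```python
# Pick the best memory clock from logged benchmark runs.
# VRAM setting (MHz) -> Time Spy graphics score (figures from the post above)
runs = {
    2100: 19800,
    2120: 19600,
    2150: 19200,
}

best_clock = max(runs, key=runs.get)  # clock with the highest logged score
print(f"Best memory clock in this log: {best_clock} MHz ({runs[best_clock]} pts)")
```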

On Igor's Lab forum there is the RED BIOS REBELLION group, which is already working on unlocking the BIOS. So far they have not been successful, but maybe in the very near future. 
This is also the group behind the creation of MPT.


----------



## ratirt (Mar 11, 2021)

turbogear said:


> On second thought about voltage: as the 6800XT uses the same 7nm process as the 6900XT, in theory it should be able to go as high on voltage as the 6900XT (1175mV vs 1150mV), so it would be interesting if somebody could unlock the 6800XT voltage slider to accept 6900XT-like settings.
> 
> On the VRAM, as I mentioned last time, I have read a lot of forums and see that many 6800XTs and 6900XTs out there cannot even reach 2150MHz, so I wonder about the quality of the Samsung memory chips.
> Maybe a higher voltage setting would help the memory too.
> ...


I don't doubt the voltage lock, but I'm not sure why AMD has locked the cards so severely. 
If the 6800XT could hit a voltage limit like the 6900XT's, the 6800XT would surely have clocked higher than the 6900XT, if of course the frequency was not locked as well. 
I think the biggest problem here is the RAM. AMD locked the chip because the memory is not the best; if everything had been unlocked, the memory would have been the problem. 

So basically, push the memory frequency toward 2150 and the score goes down. I think the memory is the bottleneck here and AMD locked the GPUs because of this.


----------



## GamerGuy (Mar 11, 2021)

Sluppermand said:


> Finally I were able to pick up my computer from the repair shop, so now I'm part of this club as well.


Welcome aboard the RX 6000 bandwagon!


----------



## turbogear (Mar 11, 2021)

ratirt said:


> I don't doubt the voltage lock, but I'm not sure why AMD has locked the cards so severely.
> If the 6800XT could hit a voltage limit like the 6900XT's, the 6800XT would surely have clocked higher than the 6900XT, if of course the frequency was not locked as well.
> I think the biggest problem here is the RAM. AMD locked the chip because the memory is not the best; if everything had been unlocked, the memory would have been the problem.
> 
> So basically, push the memory frequency toward 2150 and the score goes down. I think the memory is the bottleneck here and AMD locked the GPUs because of this.


So when one looks at the configuration of the 6900XT Toxic, which Sapphire claims is the fastest 6900XT, you can see they also use 2135MHz VRAM.
So it could very well be that the VRAM is not able to go much higher, as many users are also experiencing.

Below is the table for the Sapphire 6900XT Toxic Limited Edition.
You can see the VRAM in Toxic Boost mode is set to 2135MHz, and it also indicates a maximum achievable GPU frequency on the 6900XT of 2666MHz.
That would be around a 2715MHz setting in Radeon Software. I am not sure what voltage Sapphire uses to achieve this high a boost.
It would be nice to find out if they undervolt their 6900XT Toxic to lower than 1175mV. 

In the article, which is in German, they mention that Sapphire allows 332W GPU core power to achieve this boost. That is similar to what I did with the Red Devil.  
It seems they use well-binned chips, as these can go higher. My Red Devil achieved only a max boost of 2640MHz with similar power settings.









Sapphire stellt die TOXIC Radeon RX 6900 XT Limited Edition vor (Sapphire presents the TOXIC Radeon RX 6900 XT Limited Edition) - www.hardwareluxx.de


----------



## ratirt (Mar 11, 2021)

turbogear said:


> So when one looks at the configuration of the 6900XT Toxic, which Sapphire claims is the fastest 6900XT, you can see they also use 2135MHz VRAM.
> So it could very well be that the VRAM is not able to go much higher, as many users are also experiencing.
> 
> Below is the table for the Sapphire 6900XT Toxic Limited Edition.
> ...


I really need to check how far mine can go, with the core and the memory. Probably similar to what you achieved with the 6900XT Red Devil.


----------



## turbogear (Mar 11, 2021)

ratirt said:


> I really need to check how far mine can go, with the core and the memory. Probably similar to what you achieved with the 6900XT Red Devil.


I was doing some reading about the Sapphire Toxic Limited Edition that I linked above.
It seems that Sapphire has cherry-picked chips, as they are making promises regarding boost and performance. 
It looks interesting, but I think it is an expensive card, as it is also limited in the number of units they will produce:






SAPPHIRE TOXIC Radeon RX 6900 XT Limited Edition Review - www.tweaktown.com
It has been nearly 10 years since the last TOXIC graphics card from SAPPHIRE, and boy -- what a release! Fastest RX 6900 XT ever.
				











----------



## Felix123BU (Mar 11, 2021)

Even though this is not 100% related to this topic (75% if you have a last-gen 6-core CPU or less), I found this Hardware Unboxed video highly amusing and entertaining


----------



## Caring1 (Mar 11, 2021)

Sluppermand said:


> Finaly I were able to pick up my computer from the repair shop, so now I'm part of this club aswell.
> 
> 
> 
> ...


Dual channel RAM is not a BIOS setting; it requires at least two sticks of RAM in the appropriate slots, and it will be enabled automatically. Also, your temps are good.


----------



## Sluppermand (Mar 11, 2021)

Caring1 said:


> Dual channel RAM is not a BIOS setting; it requires at least two sticks of RAM in the appropriate slots, and it will be enabled automatically. Also, your temps are good.


The RAM is placed in slots #2 and #4 counting from the CPU; as far as I know, that is the correct way of placing them.
Linus Tech Tips has mentioned something about a BIOS setting. I'm quite sure that was about people who didn't activate dual channel mode, but of course my memory is not perfect.
I will try to check up on it.


----------



## Gmr_Chick (Mar 11, 2021)

Felix123BU said:


> Even though this is not 100% related to this topic, (75% if you have a last gen 6 core CPU or less) I found this Hardware Unboxed highly amusing and entertaining



Great video. I could picture all the Nvidia fanboys gathering up their pitchforks while watching it


----------



## Felix123BU (Mar 11, 2021)

Sluppermand said:


> The RAM is placed in slots #2 and #4 counting from the CPU; as far as I know, that is the correct way of placing them.
> Linus Tech Tips has mentioned something about a BIOS setting. I'm quite sure that was about people who didn't activate dual channel mode, but of course my memory is not perfect.
> I will try to check up on it.


Dual channel is enabled automatically if you have at least 2 matching sticks in the right positions, so in your case 2-4, or alternatively 1-3, is dual channel. (1-3 or 2-4 is the norm; some boards might go for 1-2 / 3-4, but that's rare.) Also, some boards have a preferred position for dual channel, as in 2-4 before 1-3, but you can find that in the motherboard's manual. I have not seen a board that would let you enable or disable dual channel in years; long ago there were some.
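If you want to verify from the OS rather than the BIOS, the slot population is visible in SMBIOS data (`dmidecode -t memory` on Linux, or CPU-Z's Memory tab on Windows). A small sketch parsing dmidecode-style output; the sample text and helper function are illustrative, not from a real board:

```python
# Count populated DIMM slots from dmidecode-style SMBIOS output.
# SAMPLE is made-up illustrative text, not output from a real board.
SAMPLE = """\
Locator: DIMM_A1
Size: No Module Installed
Locator: DIMM_A2
Size: 16 GB
Locator: DIMM_B1
Size: No Module Installed
Locator: DIMM_B2
Size: 16 GB
"""

def populated_slots(dmi_text: str) -> list[str]:
    """Return the locators of slots that actually have a module installed."""
    slots, current = [], None
    for line in dmi_text.splitlines():
        line = line.strip()
        if line.startswith("Locator:"):
            current = line.split(":", 1)[1].strip()
        elif line.startswith("Size:") and "No Module" not in line:
            slots.append(current)
    return slots

slots = populated_slots(SAMPLE)
# One populated slot per channel (A2 + B2) -> dual channel
print(slots, "-> dual channel" if len(slots) == 2 else "-> check placement")
```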



Gmr_Chick said:


> Great video. I could picture all the Nvidia fanboys gathering up their pitchforks while watching it


I am expecting Nvidia to send some ninjas after Steve soon, to deliver another letter about their "editorial direction"


----------



## GamerGuy (Mar 12, 2021)



Felix123BU said:


> Even though this is not 100% related to this topic, (75% if you have a last gen 6 core CPU or less) I found this Hardware Unboxed highly amusing and entertaining
> 
> *SNIP*


That's why I love Steve; he's not afraid to point out nVidia's shortcomings, though I take it this vid was more for edification than anything else. Let's wait and see; nVidia doesn't like it when their cards are shown in an unfavorable light, so they might respond.....


----------



## ratirt (Mar 12, 2021)

GamerGuy said:


> That's why I love Steve; he's not afraid to point out nVidia's shortcomings, though I take it this vid was more for edification than anything else. Let's wait and see; nVidia doesn't like it when their cards are shown in an unfavorable light, so they might respond.....


I think there's not one company that would like to see their products shown in a bad light. I wonder what NV is going to do about it. Threats, or some product stack adjustments?
Linus slammed NV for their crypto GPUs as well. He wasn't as conservative in his choice of words as Steve was.
For those who are interested and haven't seen it.


----------



## GamerGuy (Mar 12, 2021)

Of course, I saw Linus rip nVidia a new one over the threat made by nVidia to Steve over Steve's 'editorial directions', that's why I'm an LTT forum member as well.


----------



## ratirt (Mar 12, 2021)

GamerGuy said:


> Of course, I saw Linus rip nVidia a new one over the threat made by nVidia to Steve over Steve's 'editorial directions', that's why I'm an LTT forum member as well.


The video linked there is Linus ripping NV a new one, not about Steve and his 'editorial direction' but about the mining cards exclusively.


----------



## GamerGuy (Mar 12, 2021)

ratirt said:


> The video link there is Linus ripping NV a new one not about Steve and his 'editorial direction' but mining cards exclusively.


Sorry 'bout the mix-up. I'd meant I'd seen Linus ripping nVidia a new one over their threat to Steve, back when the controversy broke out last year and Steve reached out to LTT. This one is about CMP and nVidia basically bamboozling peeps with their "CMP doesn't affect GPU supply" nonsense. Gotta give it to nVidia's PR and marketing division, they're real apex-level spin doctors!


----------



## kapone32 (Mar 12, 2021)

Is it just me, or does the 6800XT have some seriously rich video output? Everything seems brighter and the colours richer.


----------



## Felix123BU (Mar 12, 2021)

kapone32 said:


> Is it just me, or does the 6800XT have some seriously rich video output? Everything seems brighter and the colours richer.


Assuming you come from an Nvidia card: there is an urban legend that Nvidia's color compression hurts image quality, and even though AMD also uses some compression, it's not as aggressive. I have seen posts from people switching between Nvidia and AMD reporting similar things, but I have not personally experienced it, so I can't comment on how true it is, and the internet is full of ....


----------



## kapone32 (Mar 12, 2021)

Felix123BU said:


> Assuming you come from an Nvidia card: there is an urban legend that Nvidia's color compression hurts image quality, and even though AMD also uses some compression, it's not as aggressive. I have seen posts from people switching between Nvidia and AMD reporting similar things, but I have not personally experienced it, so I can't comment on how true it is, and the internet is full of ....


I am coming from a Vega 64 and a 5700. I have no idea about the internet, though, as I like to prove things for myself as well.


----------



## sepheronx (Mar 12, 2021)

It's WCCFtech, but who knows, maybe it's legit?









AMD Radeon RX 6700 XT ray tracing performance has been leaked - videocardz.com
AMD Radeon RX 6700 XT faster than GeForce RTX 3070 without ray tracing. It looks like Wccftech got their hands on RX 6700 XT performance numbers, including ray tracing. The site has provided information on RX 6700 XT framerates at 1440p...


----------



## goosegoose (Mar 12, 2021)

Recently got my 6800XT reference card, paired with a 3900X using auto-overclock. The GPU is running at stock speeds.

I'm getting lower scores than usual. Did I just get a bad card or something?


----------



## Felix123BU (Mar 12, 2021)

goosegoose said:


> Recently got my 6800xt reference card. Paired with a 3900x with an auto-overclock. GPU running at stock speeds.
> 
> Getting lower scores than usual. Did I just get a bad card or something?
> 
> View attachment 192131


At stock speeds that's about right (stock settings are very conservative); the 6800XT really shines when overclocked, and from what I've seen from others, I have yet to see a "bad" card. If you want to go that route, try a simple OC: in Wattman set Max Frequency to 2550 or 2600MHz, leave the voltage at 1150mV, put the memory at 2100MHz, set the fans to max and see what score you get. The reference cooler can hold those clocks; it just might get a bit loud.


----------



## goosegoose (Mar 12, 2021)

Felix123BU said:


> At stock speeds that's about right (stock settings are very conservative); the 6800XT really shines when overclocked, and from what I've seen from others, I have yet to see a "bad" card. If you want to go that route, try a simple OC: in Wattman set Max Frequency to 2550 or 2600MHz, leave the voltage at 1150mV, put the memory at 2100MHz, set the fans to max and see what score you get. The reference cooler can hold those clocks; it just might get a bit loud.


Awesome! Thanks for the advice. I'll try it out.


----------



## Butanding1987 (Mar 14, 2021)

I'm not exactly sure what changed, but this has been my highest graphics score in Time Spy so far. Maybe it's the weather.


----------



## turbogear (Mar 14, 2021)

Butanding1987 said:


> I'm not exactly sure what changed, but this has been my highest graphics score in Time Spy so far. Maybe it's the weather.


Not bad.   
This is actually in line with my score for a stable setting.
For my daily stable setting I get a score in the 19750 to 19800 range; it varies from time to time, maybe with the weather.


----------



## Felix123BU (Mar 16, 2021)

Did some more RAM testing on my 6800XT, and it seems that for me 2080MHz with fast timings, or 2110MHz with default timings, is the limit before I start getting performance regression.

I guess this is related to the voltage AMD allows for the memory and/or controller; that's probably why the max is set at 2150MHz and it does not really go much higher. I wonder if it's just lower-binned memory chips or a totally arbitrary limitation imposed by AMD.


----------



## turbogear (Mar 16, 2021)

Felix123BU said:


> Did some more RAM testing on my 6800XT, and it seems that for me 2080MHz with fast timings or 2110MHz with normal timings is the limit before I start getting performance regression.
> 
> I guess this is related to the voltage AMD allows for the memory and/or controller; that's probably why it is capped at 2150MHz and does not really go much higher. I wonder if it's lower-binned memory chips or just a totally arbitrary limitation imposed by AMD.


Yes, unfortunately the memory is limited; that is also what I found with the 3 cards I tested. 
My 6800XT has a 2100MHz limit.
The 6900XT Red Devil was limited to 2120MHz.
Over the weekend I had a Sapphire 6900XT Toxic Limited Edition; it was limited to 2110MHz.

I think I am giving up on getting good performance out of a 6900XT after this second disappointing sample. 
Sapphire made big promises for the *Sapphire 6900XT Toxic Limited Edition*, as written up on Hardwareluxx.
I thought this expensive 1800€ card must have a cherry-picked chip on it, but it was the biggest disappointment.

According to the specs Hardwareluxx got from Sapphire, *Toxic Boost mode* promised 2135MHz memory speed and GPU boost up to 2660MHz, but the sample I had only reached 2090MHz memory and a 2590MHz GPU boost in *Toxic Boost mode*, giving only a 20200 Time Spy graphics score.
Manual overclocking was unstable above a 20500 Time Spy score, with the GPU boosting to a maximum of 2610MHz and a maximum VRAM frequency of 2110MHz.

I was able to reach a 20900 score with a setting I thought was stable, but then I noticed that on a cold boot the PC crashed at the start of the Time Spy test.
The same thing happened with any setting that gave more than a 20500 Time Spy score: starting the Time Spy test on a freshly booted, cold GPU caused a crash.

Really a big disappointment from a Limited Edition of an expensive 6900XT model. The card has been returned. 









Sapphire stellt die TOXIC Radeon RX 6900 XT Limited Edition vor (Sapphire introduces the TOXIC Radeon RX 6900 XT Limited Edition) (www.hardwareluxx.de)
				




Spec table posted on Hardwareluxx:



This is the only photo I took of the card. The box was almost the size of three 6900XT Red Devil boxes.


----------



## Gmr_Chick (Mar 16, 2021)

turbogear said:


> Yes, unfortunately memory is limited that is also what I found out with the 3 cards that I tested.
> My 6800XT has 2100MHz limit.
> The 6900XT Red Devil was limited to 2120MHz.
> I had on weekend Sapphire 6900XT Toxic Limited Edition. This was limited to 2110MHz.
> ...



No offense, but you DO realize how petty you sound, right? Whining over the fact that you couldn't get a zillion points on a stupid benchmark, from not just any 6900XT but a *Sapphire Toxic Edition 6900XT!* Did you even game on the damn thing before you returned it?


----------



## Felix123BU (Mar 16, 2021)

turbogear said:


> Yes, unfortunately memory is limited that is also what I found out with the 3 cards that I tested.
> My 6800XT has 2100MHz limit.
> The 6900XT Red Devil was limited to 2120MHz.
> I had on weekend Sapphire 6900XT Toxic Limited Edition. This was limited to 2110MHz.
> ...


Keeping in mind that the 6800, 6800XT and 6900XT all use the same RAM chips and, as far as I know, the same underlying memory system, the memory should perform the same on all three, more or less. The only difference is the number of CUs, and the CU difference between the 6800XT and 6900XT is too small to make a significant difference. I don't consider 10% more performance worth 350 USD extra, not even close. And I still dream of what these cards could do with much faster RAM.

The 6900XT at 1000 USD is, from my point of view, a cash grab, nothing more. And nobody can accuse me of hating on AMD; I have had only AMD GPUs for the past 10 years, not because I am such a fan, but because price/performance/longevity-wise they were always the best choice..... and I don't have to use a control interface from 1995   

Regarding the Toxic you tested, it seems only one thing is toxic about that card: the 1800 EUR price.


----------



## turbogear (Mar 17, 2021)

@Gmr_Chick
Please let's respect each other on the forum. I don't appreciate your use of offensive language. 

I am not looking to top the benchmark charts; I don't even report my scores online most of the time.
I was expecting to get one of the good-overclocking 6900XTs (I have seen that there are many such samples out there) for higher game performance, because I am thinking of moving from a 2K monitor to 4K, and every percent of extra FPS will help.
The der8auer review of the *6900XT Red Devil Limited Edition* was really interesting, as he could reach 2700MHz with it. 
That is what I thought a limited edition should be able to achieve, but it seems that is not the case. The only thing they really promise is higher performance than the reference 6900XT, which could be only marginally higher than a 6800XT.

Seeing that Sapphire's spec on HardwareLuxx promised boost up to 2666MHz and a VRAM frequency of 2135MHz, I thought this would be a good-performing card.
A higher benchmark score translates to higher gaming performance.

Of course I tried the Borderlands 3 benchmark and the Assassin's Creed Valhalla benchmark, but the Toxic's performance increase over my 6800XT was only around 3% while it consumed around 50W more.
The 6900XT Red Devil that I tested before gave 5% higher FPS in those two game benchmarks.

So, as Felix123BU put it, the only thing toxic about the Sapphire was the price. 
These two tests prove to me that it is not really worth investing more money in a 6900XT.
Many 6800XTs perform really well and come very close to a 6900XT.
The 10% performance difference, I think, only holds if you compare default settings.
Once you tune a 6800XT, it comes within striking distance of a 6900XT.
The 6900XT does not scale much higher than the 6800XT without needing a nuclear power plant to supply it.   


@Felix123BU
I noticed that it is possible to get a higher memory clock by increasing the TDC Limit SoC from 55A to 63A.
I saw that ASUS sets the TDC Limit SoC to 63A on its liquid-cooled 6800XT.
I can now raise my VRAM frequency from 2100MHz to 2120MHz, which translates to higher performance.

ASUS also uses a TDC Limit GFX of 364A on the liquid-cooled 6800XT. I have increased mine from 320A to 340A.
I will not go as high as the ASUS card, since I have the reference card, which has only 2x8-pin power and a 10-phase VRM.
With this tuning my 6800XT can now go to 2715MHz@1010mV.
That gives a boost up to 2660MHz in Time Spy. At the start of graphics test 2 the frequency dips toward 2590MHz; that is the most intensive part of the test.
I let graphics test 2 run in a loop for a long time to test stability.
Time Spy graphics test 2 is nasty: if there is any VRAM or GPU frequency instability, it crashes.
I have not tested this setting in games yet; I will do that in the next few days by playing AC Valhalla and Cyberpunk.

Here is the current stable setting I achieved for the 6800XT.


----------



## Felix123BU (Mar 17, 2021)

turbogear said:


> @Gmr_Chick
> Please let's respect each other on the forum. I don't appreciate your use of offensive language.
> 
> I am not looking for topping the Benchmark charts. I don't even report the scores online most of the time.
> ...


One thing I can totally understand is the need for every extra FPS when changing monitor and resolution; I had the same "issue" last year, hence the 6800XT   
It's funny, I was running 61A TDC SoC; I raised it when I first played with MPT, and it did not really help RAM speed. It's not that I can't run the memory above 2100MHz (I can, and it does not crash), but anything above 2110MHz gives a performance regression; it is a repeatable experiment in my case.

This is what I get with my daily settings (335A TDC GFX, 285W Power Limit GPU, 61A TDC SoC; 2650MHz GPU, 2080MHz memory):







I can get over 20k by pushing it a bit more, but I have no need for that    Very, very happy with how it's running now!

I have a question: you say you run your GPU voltage at 1010mV, but I noticed on mine that whatever I set, it always goes to 1150mV under load. Did you limit your max voltage in MPT?


----------



## turbogear (Mar 17, 2021)

By the way, just to let the community here know: I reported a nasty bug in GPU-Z to W1zzard last Friday, and he fixed it and released the new GPU-Z v2.38.0 the same day. 

There was a big issue with GPU-Z saving RDNA2 BIOSes.
Before v2.38.0, the BIOS saved by GPU-Z for RDNA2 cards (6800, 6800XT, 6900XT) was only 512KB.
That is wrong; the real BIOS size of these cards is 1024KB.
So if somebody had attempted to flash a wrong-size BIOS onto their card, it would have bricked it.

W1zzard told me on Friday he would delete all the 512KB RDNA2 BIOSes from the TechPowerUp BIOS database.
So if you go there, you will see that all those BIOSes are gone; only the ones from reference review samples that W1zzard uploaded on Friday remain.

To help the community, you can submit your RDNA2 BIOS to the TPU database through GPU-Z v2.38.0. 

I think W1zzard will appreciate having correct BIOSes in the database again. 

TechPowerUp GPU-Z v2.38.0 Released | TechPowerUp Forums



Felix123BU said:


> I can get over 20k by pushing it a bit more, but I don't have the need for that    Very very happy with how its running now!
> 
> I have a question, you say you run your GPU voltage at 1010mv, I noticed on mine that whatever I set it always goes to 1150mv on load. Did you limit your max voltage in MPT?


1010mV is just the setting in Radeon; it is not the real voltage the GPU uses.
The real voltage goes to 1150mV in my case as well.
But what I noticed is that the lower you can set this value in Radeon, the higher the performance you can achieve at the same power limit.
If I set, for example, 1030mV instead of 1010mV in Radeon, the card boosts to lower frequencies: it hits the 1150mV target at a lower frequency.

Regarding the VRAM frequency, I also had regression before when I set anything higher than 2100MHz.
After increasing TDC SoC to 63A, I can go to 2120MHz without regression. Regression now starts around 2124MHz for me, so I leave it 4MHz below that.


----------



## Butanding1987 (Mar 17, 2021)

turbogear said:


> By the way just to let the community here know, I report a nasty bug with GPU-Z to W1zzard last Friday,
> He fixed this and released the new v2.38.0 GPU-Z on the same day.
> 
> There was a big issue with GPU-Z and saving of the RDNA2 bioses.
> ...


Submitted my Nitro+ Performance BIOS.




turbogear said:


> @Gmr_Chick
> Please let's respect each other on the forum. I don't appreciate your use of offensive language.
> 
> I am not looking for topping the Benchmark charts. I don't even report the scores online most of the time.
> ...


Can you write short instructions on how to flash the BIOS using MPT? I read the write-up on Igor's Lab's website and the writing style is kind of difficult to understand. Which flash tool did you use to save your default BIOS, and how do you save a new one with the power change? Thanks.


----------



## Athlonite (Mar 17, 2021)

@turbogear you do realize that the cards given to reviewers are usually cherry-picked for exactly that sort of thing, right? Benching well in reviews makes the GPU look its best; with retail cards it's a 1-in-100 chance of getting a great clocker, if you're lucky.


----------



## turbogear (Mar 17, 2021)

Athlonite said:


> @turbogear you do realize that the cards given to reviewers are usually cherry-picked for exactly that sort of thing, right? Benching well in reviews makes the GPU look its best; with retail cards it's a 1-in-100 chance of getting a great clocker, if you're lucky.


Yes, but I thought that when they release a Limited Edition version of a card, with only a few hundred or thousand samples produced, they would cherry-pick the chips for good performance to show off their brand name with the limited edition.

One thing I have to add: they did a good job with the cooling.
The junction temperatures on the Limited Edition stayed below 75°C the whole time with the fan set to 80% and a lot of power being pushed into it (370W core power and 368A core TDC set with MPT).
I did not find the fans that loud, and the pump was silent for me.
Igor also reviewed this card and complained the whole time about the noisy Asetek pump.
His pump was noisy, but his sample performed better than mine. 

Toxic Boost is also a nice feature: it extracts close to the maximum from the card just by pressing a button in the Trixx software. 
From the only two reviews I could find, it is really not possible to get much more performance than what Toxic Boost already offers.
My sample underperformed there compared to the reviewed cards.
I tried to push it manually, but my sample gave a 20200 Time Spy score with Toxic Boost, and the maximum stable manual OC was 20500.






Butanding1987 said:


> Submitted my Nitro+ Performance BIOS.
> 
> 
> 
> ...


As far as I know, it is not possible to flash the BIOS with Red BIOS Editor at the moment.
There is another tool that can be used to flash it, but why do you want to flash the BIOS?

There is a huge risk of bricking the card, and at the moment you cannot change limits like voltage or frequency anyway.
There is a huge ongoing discussion on the Igor's Lab forum about it, but there is no real solution yet for changing such limits with RBE.

For adjusting power limits you don't have to flash the BIOS; you can make such changes with MPT.

In any case, here you can find the tool (ATITool v3.15) that can be used to flash the BIOS. 
Use it at your own risk: 








RDNA2 RX6000 Series Owners Thread, Tests, Mods, BIOS & Tweaks ! (forums.guru3d.com)




The only thing I use BIOSes from other 6800XTs for is loading settings into MPT from another 6800XT variant. 
All the BIOSes are the same except for changes to *Fan Profile*, *Power Limit GPU (W)*, *TDC Limit GFX (A)*, and *TDC Limit SoC (A)*. 
To take over these settings from another card, I don't need to flash the BIOS; I only load the other BIOS into MPT and apply the settings from there.


----------



## Butanding1987 (Mar 17, 2021)

turbogear said:


> Yes, but I thought when they release Limited Edition version of a card with only few 100/1000 samples produced then they will cherry pick these SoC to get good performance to show off their brand name  with the limited edition.
> 
> One thing I have to add was that they did good job with the cooling.
> The Junction temperatures on the Limited edition were all time below 75°C with fan set to 80% and lot of power being pushed into it (370W core power and 368A core TDC set with MPT).
> ...


I thought you had to flash the BIOS to apply new power limits. How do you force Wattman to recognize the new settings? Thanks.


----------



## turbogear (Mar 17, 2021)

Butanding1987 said:


> I thought you had to flash the BIOS to apply new power limits. How do you force Wattman to recognize the new settings? Thanks.


For that you only need the MorePowerTool (MPT).
It does not flash the BIOS.
It only uses the BIOS to write settings into the Radeon Software SPPT (Soft PowerPlay Tables).
These are tables inside the Radeon Software, so every time you update the driver you need to apply the settings again, as they are lost.

You need a copy of a BIOS file, which you load into MPT with the Load button.
You can save your own BIOS locally with GPU-Z and then load it into MPT to see the default settings.

After that, go to the Power and Voltage tab and change the settings (marked in yellow in the screenshot below) as you need them.
When you are finished editing, press Write SPPT and reboot the computer.
You need to reboot for the settings to be applied.

Please note that you should not try to increase the voltage range or frequency range; these are hard-locked by the Radeon driver.
If you change them, your card will go into safe mode after reboot and the frequency will be stuck at 500MHz.
In that case, restore the defaults by loading the BIOS into MPT again, applying the settings, and rebooting.

You can decrease the voltage, but you cannot increase it at the moment.
Maybe the Igor's Lab community, which created MPT, will find a workaround in the future, but none exists so far.

Here you can find a guide how to use this utility:








						The complete Big Navi UV Guide: Undervolting and power saving with the MorePowerTool simply explained | Practice | igor'sLAB
					

New year, new luck! Due to various inquiries and great interest in the topic, I have put together a detailed UV guide for you here, since I had a small craft project pending anyway.




					www.igorslab.de


----------



## Felix123BU (Mar 17, 2021)

Athlonite said:


> @turbogear you do realize that the cards given to reviewers are usually cherry-picked for exactly that sort of thing, right? Benching well in reviews makes the GPU look its best; with retail cards it's a 1-in-100 chance of getting a great clocker, if you're lucky.


Generally speaking you are correct, but for the 6000 series I have yet to see a review of a card with the performance mine has, or with the scores posted in this thread. That's probably because they run very conservatively at stock, and no review really bothered to overclock them seriously. That being said, I think you can "forgive" some of us for hoping to get a better-than-average binned chip, since the average seems quite high with these cards 



turbogear said:


> By the way just to let the community here know, I report a nasty bug with GPU-Z to W1zzard last Friday,
> He fixed this and released the new v2.38.0 GPU-Z on the same day.
> 
> There was a big issue with GPU-Z and saving of the RDNA2 bioses.
> ...


It's funny: I ran the SoC TDC at 61A, raised it to 63A for kicks, and yup, I can run the memory 20MHz higher   
Simple man's math would say each extra amp equates to 10MHz more; I might test that, though I doubt it will hold, since the SoC TDC readings are well below 63A even when the memory is fully loaded, so that should not be a bottleneck.
Without being able to change the memory voltage, there won't be any magical jumps.

*AMD, IF ANY OF YOU CAN READ THIS, UNLOCK THE 6000 SERIES!!!!!*


----------



## turbogear (Mar 17, 2021)

Felix123BU said:


> Generally speaking you are correct, but for the 6000 series I have yet to see a review of a card that has the performance mine has, or the scores present on this thread. That's probably because they are ran very conservatively at stock, and no review really bothered to overclock it seriously. That being said, I think you can "forgive" some of us that hope they can get a better than average binned chip, since the average seems quite high with these cards
> 
> 
> Its funny, I ran the SOC TDC at 61A, raised it for kicks to 63A, and yup, can run it 20Mhz higher
> ...


Man, we are geniuses.  
Good that my discovery also worked for you. 

The memory needs a voltage unlock to tune it further. We need BIOS modding. 

I was brainstorming yesterday about how DDR is usually interfaced to an SoC, so I thought: why not raise the SoC TDC target and see if it helps with GDDR frequency? It helped for me, so I shared it with you.

Now my important question is how far this value can safely be raised to reach 2150MHz.
63A is the value ASUS uses for the 6800XT Strix LC, so I figured they would not use an unsafe value. 

As you mentioned, this generation of AMD GPUs are great performers, and a lot of cards out there are really great for tuning.

Unlike the average user who buys a GPU, puts it in the PC, installs drivers, and goes gaming, many of us on this forum love our hardware and have, in addition to gaming, a hobby of tuning our systems for the best possible performance. 

As you also said, most reviews on the internet did not bother much with tuning these GPUs.
What they tried was a very simple overclock: raising the power limit and shifting the frequency bar higher, without tuning the voltage or playing with tools like MPT.


----------



## Felix123BU (Mar 17, 2021)

turbogear said:


> Man we are geniuses.
> Good that my discovery also worked for you.
> 
> The  memory needs voltage unlock to tune it further. We need bios moding.
> ...


  agree with all you said, hence, I will say it again:

*AMD, IF YOU SEE THIS, FREE THE 6000 SERIES!!! *


----------



## Butanding1987 (Mar 18, 2021)

turbogear said:


> @Gmr_Chick
> Also, ASUS uses a TDC Limit GFX of 364A on the liquid-cooled 6800XT. I have increased mine from 320A to 340A.
> I will not go that high as ASUS card as I have reference card which has only 2x8pin and 10phase VRM.


I think the AMD reference cards use a 12-phase GPU VRM design. So maybe you should push it further.


----------



## turbogear (Mar 18, 2021)

Butanding1987 said:


> I think the AMD reference cards use a 12-phase GPU VRM design. So maybe you should push it further.


Well, yes, it could be pushed further, but it is only 10 phases for Vcore on the reference 6800XT card, not 12.  
In total there are 15 phases: 10 for Vcore, 2 for SoC, and 3 for VRAM.

In any case, each phase can theoretically deliver 70A, so a total of around 700A, but I am not sure how far one can push it without a lot of cooling for the VRM.
We also don't know how many amperes the PCB is designed to handle; pushing it too hard on a regular basis could damage the PCB.
That would not be a problem if one is only pushing it for a short time to run some benchmarks. 

I also did some research on PSU cables. With high-quality 18AWG cable it is possible to push up to 8A x 3 x 12V = 288W per 8-pin cable; there are 3 power-carrying lines in an 8-pin cable (only 2 in a 6-pin).

So if one uses two separate cables from the PSU to the GPU, it should be possible to get up to 576W over them, but I personally would not push more than 450W over the cables.
More current makes the cables run hot, and the voltage drop across them increases.
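The arithmetic above can be sketched in a few lines. This is a rough sketch: the 70A-per-phase and 8A-per-pin figures are the assumptions quoted in this post, not official AMD or connector-spec ratings.

```python
# Back-of-the-envelope power-delivery estimates for a reference RX 6800 XT,
# using the figures quoted above (70 A per VRM phase, 8 A per 18AWG conductor,
# 12 V rail). Forum estimates, not official ratings.

PHASE_CURRENT_A = 70        # theoretical max per power stage
VCORE_PHASES = 10           # reference card: 10 Vcore + 2 SoC + 3 VRAM phases


def vrm_capacity_amps(phases: int, amps_per_phase: float = PHASE_CURRENT_A) -> float:
    """Theoretical combined current a VRM section could deliver."""
    return phases * amps_per_phase


def cable_power_watts(pins_12v: int, amps_per_pin: float = 8.0, volts: float = 12.0) -> float:
    """Power one PCIe connector can carry over its 12 V pins."""
    return pins_12v * amps_per_pin * volts


print(vrm_capacity_amps(VCORE_PHASES))   # 700 A theoretical Vcore capacity
print(cable_power_watts(3))              # 288.0 W per 8-pin (three 12 V pins)
print(2 * cable_power_watts(3))          # 576.0 W over two separate 8-pin cables
```

The same helper shows why a 6-pin connector, with only two 12V lines, tops out lower: `cable_power_watts(2)` gives 192W under these assumptions.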


----------



## goosegoose (Mar 18, 2021)

I'm not sure what I'm doing wrong. I'm getting bad scores even with an overclock. Test.

Is it my CPU or my motherboard? I'm using a 3900x with a slight undervolt and a Gigabyte GA-AX370 Gaming k5.


----------



## Athlonite (Mar 18, 2021)

@goosegoose that looks like your mobo is the bottleneck


----------



## goosegoose (Mar 18, 2021)

Athlonite said:


> @goosegoose that looks like your mobo is the bottleneck


Could it have to do with PCIe 3.0 vs 4.0?


----------



## turbogear (Mar 18, 2021)

goosegoose said:


> Could it have to do with PCIE 3.0 vs 4.0?


Also, try not to push the VRAM that far. 
On many RDNA2 cards the score actually drops if the memory is pushed that far.
Mine does not like going much higher than 2100MHz.
It can go to 2120MHz if I push the SoC TDC to 63A with MPT; the default is 55A.


----------



## goosegoose (Mar 18, 2021)

turbogear said:


> Also, try not to push the VRAM that far. 
> On many RDNA2 cards the score actually drops if the memory is pushed that far.
> Mine does not like going much higher than 2100MHz.
> It can go to 2120MHz if I push the SoC TDC to 63A with MPT; the default is 55A.


Hmm. I'll try that out.


----------



## GamerGuy (Mar 18, 2021)

Just curious, anyone here got their mitts on the RX 6700 XT, or at the very least, have successfully ordered one?


----------



## Felix123BU (Mar 18, 2021)

goosegoose said:


> I'm not sure what I'm doing wrong. I'm getting bad scores even with an overclock. Test.
> 
> Is it my CPU or my motherboard? I'm using a 3900x with a slight undervolt and a Gigabyte GA-AX370 Gaming k5.
> 
> View attachment 192908View attachment 192909


I think I know why your scores are lower than some here: it's the 3900X. I had basically the same thing happening while I was on a 3800X; the jump to a 5800X changed things considerably.
I would not say the 6800XT experience was bad paired with the 3800X, but it's on a different level with the 5800X. I can literally feel games being a lot smoother with the 5800X combined with my 6800XT.

What I cannot do is test it again with the 3800X, which went to my wife; the 6800XT is in a water loop, and changing back to the stock cooler is not an option.

From memory, the best Time Spy score I got with the 3800X was around 17800, but that was with heavily tuned system RAM as well, which also helps a lot.
If getting a Ryzen 5000 series CPU is not an option, I would not really bother with the Time Spy score; those are more for show. I would see how actual gaming is. If you are satisfied with how it games, I would really not bother with a CPU upgrade or Time Spy scores   

And also, as @turbogear said, memory should not be pushed past 2100MHz; beyond that there is a good chance it will reduce performance, in Time Spy and in actual games too. I would start the memory OC at 2050MHz and go up in increments of 10 until the score starts dropping.
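That step-up search can be sketched as a small loop. Here `run_benchmark` is a hypothetical stand-in for running a benchmark at a given memory clock and reading back the graphics score, and the scores in the example are made-up numbers that merely mimic the peak-then-regress behaviour reported in this thread:

```python
# Sketch of the "raise memory clock in 10 MHz steps until the score regresses"
# procedure. run_benchmark(clock) is a hypothetical callable that returns the
# benchmark score at that memory clock.

def find_best_memory_clock(run_benchmark, start=2050, step=10, limit=2150):
    """Step the memory clock up until the score drops, then return the
    last clock (and score) before the regression."""
    best_clock = start
    best_score = run_benchmark(start)
    clock = start + step
    while clock <= limit:
        score = run_benchmark(clock)
        if score < best_score:      # regression: the previous clock was the peak
            break
        best_clock, best_score = clock, score
        clock += step
    return best_clock, best_score


# Made-up scores that mimic the peak-then-regress behaviour reported here.
fake_scores = {2050: 19000, 2060: 19050, 2070: 19100, 2080: 19150,
               2090: 19180, 2100: 19200, 2110: 19100, 2120: 18950}
print(find_best_memory_clock(fake_scores.get))   # (2100, 19200)
```

In practice each "step" means rerunning the full benchmark, so small increments and one change at a time keep the search honest.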


----------



## goosegoose (Mar 18, 2021)

Felix123BU said:


> I think I know why your scores are lower than some here, its the 3900X. Had basically the same thing happening while I was on a 3800X, the jump to a 5800X changed things considerably.
> I would not say the experience with the 6800XT was bad paired with the 3800X, but its on a different level with the 5800X. I can literally feel games being a lot smoother with the 5800X combined with my 6800XT.
> 
> What I can not do is test it again with the 3800X which went to my wife, the 6800XT is in a water loop and changing back to the stock cooler is not an option.
> ...


Thanks a lot! That's the best info I've heard yet.


----------



## Felix123BU (Mar 18, 2021)

GamerGuy said:


> Just curious, anyone here got their mitts on the RX 6700 XT, or at the very least, have successfully ordered one?


I just checked my local stores; they are available as we speak, but oh my sweet lord, the prices are insaaaaaaaaane.......
1100 EUR for a GIGABYTE Radeon RX 6700 XT EAGLE and 1150 EUR for a GIGABYTE Radeon RX 6700 XT GAMING OC


----------



## evolucion8 (Mar 18, 2021)

Has anyone had issues with Control and Battlefield crashing when using ray tracing on the RX 6900XT? It's just weird that it does not happen with Quake RTX; I haven't tried Call of Duty. I wonder if there are issues with AMD's drivers related to ray tracing.


----------



## GamerGuy (Mar 18, 2021)

Felix123BU said:


> I just check my local stores, they are available as we speak, but oh my sweet lord, the prices are insaaaaaaaaane.......
> 1100 EUR for a GIGABYTE Radeon RX 6700 XT EAGLE and 1150 EUR for a GIGABYTE Radeon RX 6700 XT GAMING OC


Holy....Shiat! That's insane!!!  My Sapphire RX 6900 XT cost me 1300+ USD (when I got it in late December/early January), which is effectively 1105 EUR at the present conversion rate, so my RX 6900 XT was actually cheaper than the Gigabyte RX 6700 XT Gaming OC!

Edit - BTW, just noted your location, I'm a Babylon 5 man myself....


----------



## Felix123BU (Mar 18, 2021)

GamerGuy said:


> Holy....Shiat! That's insane!!!  My Sapphire RX 6900 XT cost me 1300+ USD (when I got it late December/early Jan), that's effectively 1105 EUR at present conversion rate, so my RX 6900 XT is actually cheaper than the Gigabyte RX 6700 XT Gaming OC!
> 
> Edit - BTW, just noted your location, I'm a Babylon 5 man myself....


He is behind me, you are in front of me   

Btw, those prices are not from eBay-like sites but from normal e-shops. I got my 6800XT for 700 EUR, which is basically MSRP where I live, so seeing these 6700XT prices made me laugh hard


----------



## GamerGuy (Mar 18, 2021)

Felix123BU said:


> He is behind me, you are in front of me


Delenn said that! (Referring to Sheridan, who was on B5 behind her ships, with the puny Earth Force ships in front of her.) Told ya I'm a B5 man! 

Got the series on DVD, including the movies.... sorry guys, I kinda went off topic. I get kinda excited when I talk about B5.


----------



## turbogear (Mar 18, 2021)

So I installed the new 21.3.1 driver and just raised the MPT values a little: Power Limit GPU from 284W to 289W and TDC Limit GFX from 340A to 350A.
The frequency and other settings are still the same as in my earlier post:
https://www.techpowerup.com/forums/threads/the-rx-6000-series-owners-club.276164/post-4480167

This is how my score looks.
I could not believe my eyes: 20337. 

The Sapphire 6900XT Toxic that I tested produced 20500 max while pushing 368W core and 384A TDC, and only 20200 at default. 

There is an orange exclamation mark on the score, as the driver is new and not yet validated by 3DMark.


----------



## Felix123BU (Mar 18, 2021)

turbogear said:


> So installed the new driver 21.3.1, and just raised the values in the MPT a little.  Power Limit GPU from 284W to 289W and  TDC limit GFX from 340A to 350A.
> The frequency and other settings are still the same as I posted here before look at the post:
> https://www.techpowerup.com/forums/threads/the-rx-6000-series-owners-club.276164/post-4480167
> 
> ...


Any driver issues with the new one? I don't want to jinx my luck; the driver I have is spotless


----------



## turbogear (Mar 18, 2021)

Felix123BU said:


> any driver issues with the new one? I don't want to jinx my luck, the driver I have is spotless


I just installed the driver 1.5 hours ago, so I don't know yet if there are any issues.

I re-ran Time Spy to check whether it was a one-time shot or repeatable.
Same settings as above, but it got even better.  

Just crazy. It seems the new driver is helping. 







I had never looked in the 3DMark database to see where my score stood, as I never cared about it, but I just checked and saw that my old best score was number 20 for the 5800X + 6800XT combination.





That was with the unstable 2750MHz@1030mV setting.
The new settings are 2715MHz@1010mV, which are stable so far in stress tests. 
I still need to check with games; the new AMD driver has a stress test built in.

The new results are not validated by 3DMark, as they don't recognize the new driver yet. The new score would be 8th place. 

Edit:
I also re-ran Port Royal to check whether I could reproduce the above result of 10336.
I came close on the second run: 10334.  

The old Port Royal result was also 20th; the new one would be about 9th.


----------



## Felix123BU (Mar 18, 2021)

turbogear said:


> Just installed the driver today 1.5hours ago.
> I don't know if there are any issues.
> 
> I re-ran Time Spy to check if it was one time shot or I can repeat it.
> ...


Interesting. I knew about the internal stress test from the driver release page; I will give it a go. I felt from the start that this time AMD prioritized stability first, and I am really impressed with the current driver: not one issue of any kind in 3 months



GamerGuy said:


> Delenn said that! (Referring to Sheridan, who was in B5 behind her ships, and the puny Earth Force ships in front of her). Told ya I'm B5 man!
> 
> Got the series on DVD, including the movies....sorry guys, I kinda went off topic. I get kinda excited when I talk about B5.


Love B5 as well. And to stay on topic, I wish GPU manufacturers would go back to making cool designs like they used to; I would die for a 6800 series GPU with a B5 theme   Today's GPUs look more like fancy toasters


----------



## turbogear (Mar 18, 2021)

I ran some game benchmarks with the settings that I posted previously here:








The RX 6000 series Owners' Club (www.techpowerup.com)
				





Borderlands 3 benchmark:
The gain is around 10.4% in average FPS.




Results from Assassin's Creed Valhalla.
The gain is around 8.8% in average FPS.

With the 2715MHz@1010mV:



With default GPU settings:


----------



## ratirt (Mar 19, 2021)

turbogear said:


> So installed the new driver 21.3.1, and just raised the values in the MPT a little. Power Limit GPU from 284W to 289W and TDC limit GFX from 340A to 350A.
> The frequency and other settings are still the same as I posted here before look at the post:
> https://www.techpowerup.com/forums/threads/the-rx-6000-series-owners-club.276164/post-4480167
> 
> ...


Are you saying the new driver boosts performance? I haven't installed it yet. Maybe I should look into this.


----------



## Felix123BU (Mar 19, 2021)

turbogear said:


> I ran some game benchmarks with the settings that I posted previously here:
> 
> 
> 
> ...


Is that with the 6800XT and the new driver? 8-10%?! That's a lot for one driver revision, and I haven't seen any notes in the driver release regarding improved performance.


turbogear said:


> Just installed the driver today 1.5hours ago.
> I don't know if there are any issues.
> 
> I re-ran Time Spy to check if it was one time shot or I can repeat it.
> ...


Ok, installed the driver... and boom, Time Spy went to 20512 (+700 points) and Port Royal to 10235 (+300 points) with the exact same settings as before...

Started CP2077. I have a save in the middle of the city where I "benchmark" the frames; in that spot the old driver gave 62-63 and the new driver gives 69-70. That's insane, easily the hardest section on the GPU in that game for me!







Need to test this further; these are some tangible gains, almost too good to be true.


----------



## turbogear (Mar 19, 2021)

Felix123BU said:


> Is that with the 6800XT and the new driver? 8-10%?! That's a lot for one driver revision, and I haven't seen any notes in the driver release regarding improved performance.
> 
> Ok, installed the driver....and booom, Time Spy went to 20512 (+700 points) and Port Royal to 10235 (+300 points) with the exact same settings as before...
> 
> ...


Yes, that is my experience too.
It seems AMD did something in the background without putting a note in the changelog.  
Performance is up.
My card was really well tuned before, so the uplift compared to my previous result under the same settings is 2% in 3DMark benchmarks.
It is 3% in Borderlands 3 and 2% in Assassin's Creed Valhalla.

It is 9-10% compared to the GPU running at default settings. 
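If anyone wants to sanity-check uplift numbers like these, the percentages are just relative score differences. A minimal sketch in Python; the scores below are made-up placeholders, not anyone's actual results:

```python
def pct_gain(new_score, old_score):
    """Percentage improvement of a new benchmark score over an old one."""
    return (new_score - old_score) / old_score * 100

# Placeholder Time Spy graphics scores:
# tuned on the old driver vs. tuned on the new driver
print(round(pct_gain(20400, 20000), 1))  # 2.0
# default settings vs. tuned settings on the new driver
print(round(pct_gain(20400, 18600), 1))  # 9.7
```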



ratirt said:


> Are you saying the new driver boosts performance? Haven't installed it yet. Maybe I should look into this


Yes, it looks like it, as @Felix123BU also noticed a nice performance boost with his 6800XT.

You can try it too and let us know if you also see an improvement in 6900XT performance.


----------



## Butanding1987 (Mar 19, 2021)

turbogear said:


> Just installed the driver today 1.5hours ago.
> I don't know if there are any issues.
> 
> I re-ran Time Spy to check if it was one time shot or I can repeat it.
> ...


I need to go home and test this.


----------



## pat-roner (Mar 19, 2021)

Installed it and saw a 400-point score increase in Time Spy. AMD Radeon Software is no longer working though...

@Felix123BU did you get that 20k Graphics score with a 6800XT or 6900XT?


----------



## Space Lynx (Mar 19, 2021)

pat-roner said:


> Installed it and saw a 400score increase in Time Spy. AMD Radeon Software is no longer working though..
> 
> @Felix123BU did you get that 20k Graphics score with a 6800XT or 6900XT?



when you install beta drivers you really should select clean install on the install screen, and uncheck the box that says keep user settings (you don't want to keep user settings); a clean install probably won't give you any issues. amd software is working just fine for me.

i'm getting higher FPS in games. noticeably higher.  well done AMD. well done.


----------



## pat-roner (Mar 19, 2021)

Yeah, I will try to do a clean install later today. I just prefer doing a normal install and reporting to AMD any bugs that I encounter, to help them out.


----------



## turbogear (Mar 19, 2021)

pat-roner said:


> Yeah, will try to do a clean install later today. I just prefer doing a normal install, and reporting any bugs to AMD to help them out finding any bugs that I encounter


What really helps sometimes, if you have driver issues, is doing a clean-up with DDU (Display Driver Uninstaller).
Here is the link:








Display Driver Uninstaller Download version 18.0.5.9 (www.guru3d.com)


----------



## Space Lynx (Mar 19, 2021)

pat-roner said:


> Yeah, will try to do a clean install later today. I just prefer doing a normal install, and reporting any bugs to AMD to help them out finding any bugs that I encounter



When a new driver comes out on my gtx 1070 laptop or my all-AMD desktop, when it asks if I want to do a clean install of the driver, i say yes, and no, do not keep my settings. i have done this for a decade straight and have never had a single issue in all ten years.

this is the way.



turbogear said:


> What really helps sometimes if you have driver issues is doing a cleaning with DDU (Display Driver Uninstaller).
> Here is the link:
> 
> 
> ...



never had to use that. i would avoid it unless my advice doesn't work; make that plan B @pat-roner


----------



## turbogear (Mar 19, 2021)

pat-roner said:


> Installed it and saw a 400score increase in Time Spy. AMD Radeon Software is no longer working though..
> 
> @Felix123BU did you get that 20k Graphics score with a 6800XT or 6900XT?


Both @Felix123BU and I have the 6800XT, and we both now have Time Spy scores above 20K.


----------



## Space Lynx (Mar 19, 2021)

turbogear said:


> Me as well as @Felix123BU both have 6800XT. We both now have higher than 20K Time Spy score.



i'm getting around 15 fps more in WoW. no settings changed. can't believe it honestly, that's a pretty good free boost... really surprised the driver notes don't advertise this. insane.


----------



## turbogear (Mar 19, 2021)

lynx29 said:


> never had to use that. i would avoid that unless my advice doesn't work, make that plan B @pat-roner


I have never had a problem with DDU. I used it multiple times in the last few weeks, especially when I was switching GPUs from the 6800XT to the 6900XT and back to the 6800XT.


----------



## Space Lynx (Mar 19, 2021)

turbogear said:


> I had never had a problem with DDU. I used it multiple times in the last weeks especially when I was switching GPUs from 6800XT to 6900Xt and back to 6800XT.



I'm not telling him not to use it, just saying if my way works and he has no issues, I wouldn't bother.


----------



## Felix123BU (Mar 19, 2021)

pat-roner said:


> Installed it and saw a 400score increase in Time Spy. AMD Radeon Software is no longer working though..
> 
> @Felix123BU did you get that 20k Graphics score with a 6800XT or 6900XT?


With the 6800XT.
I am still rather surprised by the increase across everything I tested with this driver. I haven't had a lot of time to test much yet, but I'm very curious to do so later today. Sweet driver, I would like more of this  



lynx29 said:


> I'm not telling him not to use it, just saying if my way works and he has no issues, I wouldn't bother.


Both the clean install via the driver installer and DDU work the same in theory. I also mostly used Clean Install via setup or the AMD cleanup utility, until one time it did not remove everything and I had issues with some leftovers; from then on I used DDU on every driver install and never had any issues again. But again, in theory, both should achieve the same. I guess a lot of "driver issues" were historically happening because drivers were installed on top of each other.


----------



## Space Lynx (Mar 20, 2021)

OMG I BROKE 17,000 TIMESPY WITH NEW DRIVERS!!!

THESE DRIVERS ARE FREAKIN INSANE!!!  @W1zzard please do a new test against rtx 3080 with these bad boy drivers, AMD has knocked it out of the park mate!!!  

$579 MSRP and I am going toe to toe in the boxing ring with a 3080 boys across the board!!! yeeeeha!!!!!!!!!!!!!!  

AS A TEENAGER WHEN I COULD ONLY AFFORD AMD!!! AND NOW!!! HISTORY HAS REPEATED ITSELF!!! ITS GOOD TO BE BACK BOYS!!! HELL YEAH!!!!!!!!!!


----------



## Butanding1987 (Mar 20, 2021)

I'm not quite sure what to make of stability, because TS and PR seem to require different settings for optimum performance. Time Spy likes high clock rates, while Port Royal hates them, at least in my case. Anyway, here are my results.


----------



## Space Lynx (Mar 20, 2021)

Butanding1987 said:


> I'm not quite sure what to make of stability because TS and PR seem to require different settings for optimum performance. Time Spy likes high clock rates, while Port Royal hates it, at least in my case. Anyway, here are my results.
> 
> View attachment 193127View attachment 193128



I just ran WoW for an hour, which is a shockingly good stability test, and had 0 issues.  my OC is stable. wow... I can't believe I paid $579 for a 2080 Ti slayer.  what a time to be alive. AMD is back. rock solid since i bought it on launch day. man, this is great news for the industry. mining can go piss off, im keeping my rig in prime condition


----------



## Butanding1987 (Mar 20, 2021)

lynx29 said:


> I just ran WoW for an hour, which is a shockingly good stability test, and had 0 issues.  my OC is stable. wow... I can't believe I paid $579 for a 2080 Ti slayer.  what a time to be alive. AMD is back. rock solid since i bought it on launch day. man, this is great news for the industry. mining can go piss off, im keeping my rig in prime condition


You do sound like a teenager.


----------



## Space Lynx (Mar 20, 2021)

Butanding1987 said:


> You do sound like a teenager.



thanks they were fun times. glad to relive them.


----------



## Felix123BU (Mar 20, 2021)

lynx29 said:


> I just ran WoW for an hour, which is a shockingly good stability test, and had 0 issues.  my OC is stable. wow... I can't believe I paid $579 for a 2080 Ti slayer.  what a time to be alive. AMD is back. rock solid since i bought it on launch day. man, this is great news for the industry. mining can go piss off, im keeping my rig in prime condition


The 6000 cards provoke euphoria     I am also reliving the HD 4870 feelings


----------



## turbogear (Mar 20, 2021)

Butanding1987 said:


> I'm not quite sure what to make of stability because TS and PR seem to require different settings for optimum performance. Time Spy likes high clock rates, while Port Royal hates it, at least in my case. Anyway, here are my results.
> 
> View attachment 193127View attachment 193128


Your results are again very close to mine.  

My setting is very stable though. I ran stability tests yesterday and it was solid.

With the new driver, for the first time ever, I was able to get my 6800XT stable close to a 2750MHz setting.  
The real frequency is in the 2670-2680MHz range.
In lightly loaded scenes it hits 2700MHz.

My current setting is 2745MHz@1020mV. VRAM at 2110MHz.
MPT settings of 289W, 350A TDC and SoC TDC 63A, power limit in Radeon at 15%.

Today I will try increasing the TDC to 360A.
The score can go higher; at the moment it is TDC/power limited in Time Spy Graphics Test 2.
I may also increase the power limit to 295W, which with the 15% slider would be ~339W on the core and a total of around 384W for the graphics card.
I am not sure if I should push it further, as I have a reference card with only a 10-phase Vcore.
I think yours has 14 phases.


*Please AMD, give us more drivers like this. *


----------



## Chomiq (Mar 20, 2021)

GamerGuy said:


> Holy....Shiat! That's insane!!!  My Sapphire RX 6900 XT cost me 1300+ USD (when I got it late December/early Jan), that's effectively 1105 EUR at present conversion rate, so my RX 6900 XT is actually cheaper than the Gigabyte RX 6700 XT Gaming OC!
> 
> Edit - BTW, just noted your location, I'm a Babylon 5 man myself....


There you go:



€1300.

6900 XT's are ca. €1950 now in Poland.


----------



## Felix123BU (Mar 20, 2021)

Chomiq said:


> There you go:
> View attachment 193150
> €1300.
> 
> 6900 XT's are ca. €1950 now in Poland.


Eastern Europe is the place to price gouge.    Same in my Eastern European country.  And they are selling out.


----------



## Butanding1987 (Mar 20, 2021)

turbogear said:


> You results are again very close to mine.
> 
> My setting is very stable though. I ran stability tests yesterday and it was solid.
> 
> ...


I think part of my problem is my CPU, which is running too hot (it reaches 80°C when running just two cores). I might have done a bad job of applying paste, so I'm going to redo it as soon as the thermal paste I ordered from China arrives.


----------



## GamerGuy (Mar 20, 2021)

Chomiq said:


> There you go:
> View attachment 193150
> €1300.
> 
> 6900 XT's are ca. €1950 now in Poland.


Even holier shiat!!!  An RX 6700 XT that costs more than my Nitro+ RX 6900 XT!!!

Man, I'm feeling real good about making the jump, a leap of faith so to say, getting the Nitro+ RX 6900 XT when I did.


----------



## Athlonite (Mar 20, 2021)

Yeah, so happy I bought my Sapphire Nitro+ RX6800 16GB OC when I did. I checked the prices of RX6700XTs here in Gougelandastan (New Zealand) and holy shit, they cost more than I paid for the RX6800. 
Sapphire Nitro+ RX6800 16GB OC = $1,295 NZD inc GST; Gigabyte Radeon RX6700XT 12GB Aorus Elite = $1,493.85 NZD inc GST. That's 200 efing bucks more.


----------



## Space Lynx (Mar 20, 2021)

Felix123BU said:


> The 6000 cards provoke euphoria     I am also reliving the HD 4870 feelings



my nostalgia stems from 3-4 games in the 1997-2004 range, the AGP ATi card days.  i could never afford nvidia/intel, so all my gaming memories were thanks to AMD. that is why I decided to go all-AMD on my new rig last year, purely to honor them for being there for me when I was a kid/teen.

really nice to see my $579 card smashing a $1400 2080 ti.


----------



## Felix123BU (Mar 20, 2021)

I still wonder: since everyone here who tried the new driver noticed a significant performance gain, why is there zero reference to performance gains in the driver release notes? I've been on AMD GPUs for the last 10 years, and the previous 5 with Nvidia, and this is the biggest performance leap I have noticed from one driver update.


----------



## turbogear (Mar 20, 2021)

Butanding1987 said:


> I think part of my problem is my CPU, which is running too hot (reaches 80 C when running just two cores). I might have made a bad job of applying paste so I'm going to redo it as soon as the thermal paste I ordered from China arrives.


For Zen 3 that is kind of normal, I would say, depending on whether or not you have a water block designed for the chiplet layout.

Ask @Felix123BU about Ryzen chiplet cooling.
He did intensive research on that; I once had a detailed private chat with him on the topic. 

I don't know which water block you have, but not all of them are ideal for cooling the hotspot on the Ryzen 5900X chiplet.

Based on Felix123BU's research I bought the TechN Zen 3 water block for my 5800X.  








CPU Waterblock AMD Ryzen - TechN GmbH (techn.de)
				




Max temperatures are below 78°C under all-core stress, but usually much lower, around 65°C.

For thermal compound I am using Thermal Grizzly Kryonaut, which is among the best performing compounds:








Thermal Grizzly High Performance Cooling Solutions - Kryonaut (www.thermal-grizzly.com)
				




Felix123BU is using this offset bracket with his block so that it sits over the hotspot: 





RYZEN 3000 OC Bracket - der8auer (der8auer.com)


----------



## Butanding1987 (Mar 20, 2021)

turbogear said:


> For the Zen 3 that is kind of normal I would say depending on if you have a special chiplet design ready water block or not.
> 
> Ask @Felix123BU about the Ryzen chiplet cooling.
> He did intesive research on that. I had ones a detailed private chat with him on the topic.
> ...


Thanks. I'm using a giant Bykski block that's compatible with Threadripper, so I don't think coverage is a problem. I'm also using liquid metal but I think I applied too little (ran out of it). I'll reapply it and see what happens.


----------



## Felix123BU (Mar 20, 2021)

turbogear said:


> For the Zen 3 that is kind of normal I would say depending on if you have a special chiplet design ready water block or not.
> 
> Ask @Felix123BU about the Ryzen chiplet cooling.
> He did intesive research on that. I had ones a detailed private chat with him on the topic.
> ...


Yes, I also had issues cooling down a Ryzen 5800X while building my first water loop; funnily enough, I built the loop mainly for the 6800XT. Lots of people report very high temps on the 5800X, and that's because the majority of coolers are built for CPUs whose heat center is in the middle of the chip. Ryzens have the hotspot displaced towards the bottom, mostly because of chiplet positioning (depending on how the socket is oriented on the motherboard).


----------



## turbogear (Mar 20, 2021)

turbogear said:


> You results are again very close to mine.
> 
> My setting is very stable though. I ran stability tests yesterday and it was solid.
> 
> ...


I did some tests with the new setting of 2745MHz@1020mV, VRAM at 2110MHz. This setting is really stable, with a 99.3% Fire Strike stress test score. 

I increased the power targets: MPT settings of 295W, 360A TDC and SoC TDC 63A, with the power limit in Radeon at 15%.

Here are my scores. 
It is really unbelievable. I was never able to get my 6800XT stable for such high results before. The new driver is amazing.


----------



## Felix123BU (Mar 20, 2021)

turbogear said:


> I did some tests with the new setting of 2745MHz@1020mV.  VRAM at 2110MHz. This setting is really stable with 99.3% Firestrike Stress test score.
> 
> I increased the power targets. MPT settings of 295W, 360A TDC and SoC TDC 63A, power limit in Radeon at 15%.
> 
> ...


Matched your MPT settings, game on      Here goes:







That's as far as I am willing to go power wise, but love how the reference 6800XT happily sips more power and gives more performance for each couple of watts extra 

I'm also in a love relationship with the new driver


----------



## Butanding1987 (Mar 21, 2021)

Did you guys use the same Wattman settings to get these scores for Time Spy and Port Royal?


----------



## Felix123BU (Mar 21, 2021)

Butanding1987 said:


> Did you guys use the same Wattman settings to get these scores for Time Spy and Port Royal?


Wattman for the clock settings, and MPT on top for raising the power limit. i used the exact power limits @turbogear posted above with MPT, and my clock settings are the ones below:


----------



## Butanding1987 (Mar 21, 2021)

Butanding1987 said:


> Did you guys use the same Wattman settings to get these scores for Time Spy and Port Royal?


Same settings for TS and PR?


----------



## Felix123BU (Mar 21, 2021)

Butanding1987 said:


> Same settings for TS and PR?


For me the answer is YUP, same settings

And since it's Sunday, and I really have nothing better to do at this hour, I went and checked some more performance differences with the new driver. My other "benchmark game" is Shadow of the Tomb Raider.
After I ran the benchmark, I was like "nooooo waaaay"   

Last time I ran that benchmark at 4K Highest settings I got 88 FPS; today I got 103! That's 15 extra frames... at 4K. OK, the last run was with normal power limits but still OC-ed to the max, but even so, a 15 FPS gain at 4K in SOTR is monstrous. Same exact game version, same everything else.








This is not "AMD Finewine", this is "AMD Finewine on crack"


----------



## turbogear (Mar 21, 2021)

I played Cyberpunk for 3 hours on Saturday night with the settings I described before, 2745MHz@1020mV. 
The averages I recorded over the 3-hour run are as follows:
- Average frequency 2699.6MHz 
- Average GPU core power consumption 281.4W
- Average VRAM frequency 2098.5MHz
- Average frame rate 91.7 FPS

Here is the frequency distribution:
The frequency sometimes hit 2716MHz, but was mostly in the 2699MHz range, with only 9 values recorded below 2690MHz. 



Frame rate distribution:
A dip below 70 FPS was recorded only 32 times during this run, out of a total of 4711 data points.
Frame rates in the 100-125 range were recorded 840 times.
There were some idle periods where I paused play for a few minutes; there the frame rate hit 140, but I removed those points from the evaluation.




GPU core power consumption distribution:
An average power consumption of 281W for this demanding game is really good with the high OC I am running. 
Peaks in GPU core power are mostly below 295W, with only a very few data points hitting 325W over the 3 hours, as can be seen from the distribution curve.
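For anyone who wants to reproduce this kind of analysis, the averages and distribution counts can be rebuilt from any sensor-log export (HWiNFO, etc.) in a few lines. A rough sketch with made-up sample values, not my logged data:

```python
from collections import Counter

def summarize(samples, bin_width=25):
    """Average plus a coarse histogram of logged sensor samples
    (GPU clock in MHz, frame rate, power in W), one value per polling tick."""
    avg = sum(samples) / len(samples)
    hist = Counter((s // bin_width) * bin_width for s in samples)
    return avg, dict(sorted(hist.items()))

# Made-up frame-rate samples; drop pause/idle points first, as described above
fps_log = [68, 91, 92, 95, 104, 110, 88, 90]
avg, hist = summarize(fps_log)
print(avg, hist)  # 92.25 {50: 1, 75: 5, 100: 2}
```

The binning step is the same idea as the distribution curves posted above, just with coarse 25-unit buckets.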


----------



## jesdals (Mar 21, 2021)

I've had some hard crashes with Destiny 2; the latest sent my machine down so hard that it took some 10 boot cycles to get it back on again. Tried the new driver and the previous one; no matter what I do, the machine crashes after 5-10 minutes at best. It seems the latest season pack has some issues with AMD GPUs and CPUs. Anyone else having these issues?


----------



## turbogear (Mar 21, 2021)

jesdals said:


> I've had some hard crashes with Destiny 2; the latest sent my machine down so hard that it took some 10 boot cycles to get it back on again. Tried the new driver and the previous one; no matter what I do, the machine crashes after 5-10 minutes at best. It seems the latest season pack has some issues with AMD GPUs and CPUs. Anyone else having these issues?


Sorry, can't help. I don't have Destiny 2 installed on the PC; I only have it on Xbox One X. 
I hope some other user will be able to help you here. 

@Felix123BU I have not played SOTR in a long time now; I had almost forgotten that I have it on Steam. 
I installed it again and ran the benchmark. You can see that on a 2K monitor the average FPS with my current OCed 6800XT setting is 140, and the FPS even goes much higher than my monitor's refresh rate of 165Hz.


----------



## Felix123BU (Mar 21, 2021)

jesdals said:


> I've had some hard crashes with Destiny 2; the latest sent my machine down so hard that it took some 10 boot cycles to get it back on again. Tried the new driver and the previous one; no matter what I do, the machine crashes after 5-10 minutes at best. It seems the latest season pack has some issues with AMD GPUs and CPUs. Anyone else having these issues?


Sorry to hear that, must really suck. I don't have the game, but reading around, there is a double issue: the developer and possibly the driver. Not sure if Bungie or AMD is to blame; it seems Nvidia also had a ton of issues.  But the fact that you are getting hard resets probably points to some underlying general issue that's not necessarily game related. I have had 0 issues with the combo of a 6800XT and Ryzen 5800X in any game.


----------



## watzupken (Mar 22, 2021)

Felix123BU said:


> I still wonder, since all here ho tried the new driver noticed a significant performance gain, why is there 0 reference about  performance gains in the driver release notes? Been last 10 years on AMD Gpu's, previous 5 with Nvidia, this is the biggest performance leap I have noticed with one driver update


Maybe a direct effect of this: "Among the issues fixed with Radeon Software Adrenalin 21.3.1 include high CPU utilization for Radeon Software, even when idling". Just guessing: reducing the software's CPU utilization may result in a slight bump in performance.


----------



## deadman3000 (Mar 22, 2021)

I have a 6700 XT reference model and am pondering playing with MorePowerTool to see if I can push a little more performance from it. I read that I should only touch the bottom three settings. Anyone have any 'safe' suggestions?


----------



## Felix123BU (Mar 22, 2021)

deadman3000 said:


> I have a 6700 XT reference model. Pondering playing with morepowertool to see if I can push a little more performance from it. I read that I should only touch the bottom three settings. Anyone have any 'safe' suggestions? View attachment 193356


 Congrats! Nobody really knows what "safe" settings are, but for me a maximum of plus 15% on those values would be it, and I would start incrementally: 5% extra, then 10%, and see if there is a change. Keep in mind that although a small increase should in theory not damage the card in any way, it is a risk; small, but a risk nonetheless. I would also try to find out how much the board can take at maximum; no piece of hardware runs at its absolute limits at stock, so there are always some margins to be gained potentially. AMD reference boards were historically over-engineered, so they could take a lot more, but do not take this for granted for the current gen or your card; it is still an open question, even around here.

My 6800XT is a reference model, and the only limiting factor is the power allowance: overclocking helped, and raising the power limit via MPT also helped. I run it 15% over the stock power limits, but I think I will lower that to plus 10%.
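The incremental stepping described above is easy to plan out in advance. A tiny sketch; the 255W figure is just the reference 6800XT stock core power limit discussed in this thread, so substitute your own card's value:

```python
def stepped_limits(stock_w, percents=(5, 10, 15)):
    """Incremental MPT power-limit targets above a stock board power,
    meant to be tested one step at a time."""
    return [stock_w * (100 + p) // 100 for p in percents]

# Reference 6800XT core power limit, stepped +5% / +10% / +15%
print(stepped_limits(255))  # [267, 280, 293]
```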


----------



## turbogear (Mar 22, 2021)

Felix123BU said:


> Congrats! Nobody really knows what "safe" settings are, but for me a maximum of plus 15% to those values would be it, and I would start incrementally, 5% extra, then 10% and see if there is a change. But keep in mind, although increasing by a bit should in theory not damage the card in any way, its a risk, small, but a risk nonetheless. I would also try to find out how much the board can maximally take, no piece of hardware is run at its absolute limits at stock, so there is always some margins to be gained potentially. Generally AMD reference boards where historically over-engineered, so they could take a lot more, but do not take this as granted for the current gen or your card, its still an open question even around here.
> 
> My 6800XT is a reference model, and the only limiting factor is power allowance, overclocking helped, raising power via MPT helped also helped. I run it with 15% over the stock power limits, but I think I will lower it to plus 10%.


I have mine set 30% above the stock 255W, to 332W.  
But as I showed in my previous post, my 6800XT undervolted to 1020mV at 2745MHz does not consume more than 290W on average on the core, at least in Assassin's Creed Valhalla, Cyberpunk and Borderlands 3.
I also own Control, which I have not tested yet.
I will test more games, and if in all of them the core consumption remains below 300W, I will leave the high settings. My GPU only uses 332W in stress tests and benchmarks.

@deadman3000
I would suggest a similar strategy to what @Felix123BU described to find a good balance for your card.
At a certain point the extra power consumption far outweighs the actual performance gained, and beyond that point it is not worth increasing further.  Keep an eye on the thermals too.
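One way to find that point where extra watts stop paying off is to log the benchmark score at each power step and look at the marginal gain per watt. A sketch with invented numbers, not measured results:

```python
def marginal_gain_per_watt(results):
    """results: list of (power_w, score) pairs, sorted by power.
    Returns the extra score earned per extra watt between consecutive steps."""
    out = []
    for (p0, s0), (p1, s1) in zip(results, results[1:]):
        out.append((s1 - s0) / (p1 - p0))
    return out

# Invented Time Spy scores at increasing power limits
steps = [(255, 19800), (270, 20100), (285, 20250), (300, 20300)]
print([round(g, 1) for g in marginal_gain_per_watt(steps)])  # [20.0, 10.0, 3.3]
```

When the points-per-watt figure collapses like the last step here, the extra power is mostly heat, not performance.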


----------



## Butanding1987 (Mar 22, 2021)

I think turbogear's 30% already includes the 15% extra power provided by Wattman (Radeon Software). That means he uses a GPU power limit of 289 watts in MorePowerTool. That is my card's default setting (without MPT), which gives me ~333W in total power. I haven't decided yet what is optimal for my card. I was thinking maybe I should add 9W, roughly equivalent to the 3 stock fans that I lost when I installed the water block. Not sure yet.


----------



## turbogear (Mar 23, 2021)

Butanding1987 said:


> I think turbogear's 30% already includes the 15% extra power provided by Wattman (Radeon Software). That means he uses a GPU power limit of 289 watts in MorePowerTool. That is my card's default setting (without MPT), which gives me ~333W in total power. I haven't decided yet what is optimal for my card. I was thinking maybe I should add 9W, roughly equivalent to the 3 stock fans that I lost when I installed the water block. Not sure yet.


Yes, the 30% includes the 15% extra provided by Wattman.
Exactly as you said, I have my reference card set to 289W in MPT.
I set it to 295W one time for testing, to get a higher Time Spy score,   but I don't need it for gaming as most games don't consume that much anyway.
Many custom cards have 289W by default, but those cards, like yours, have a 14-phase Vcore VRM while mine has only 10 phases. 10 phases should still be good enough with decent cooling from the waterblock. 

By the way, the only game I have found that consumes power like crazy is Control at maximum settings without ray tracing. 
I tested it for the first time yesterday, and it is the only game where my 2745MHz@1020mV OC was not good and led to crashes. I believe my power target of 330W for the core was not enough to run this game that high on the clock.
In this game I had to back off to 2675MHz@985mV. Even at the lower frequency the game was pushing 330W on the GPU core.


----------



## Butanding1987 (Mar 23, 2021)

turbogear said:


> Yes 30% includes the 15% extra provided by Wattmann.
> Exactly as you said, I have my reference card set to 289W in MPT.
> I set it one time for testing to 295W to get higher Time Spy Score   but I don't need it for gaming as most of games does not consume that high anyways.
> Many of custom cards has 289W by default,  but these cards like yours have 14-phase Vcore VRM and mine only 10-phase. 10-phase should also be good enough with decent coolig with the waterblock.
> ...


I finished that game on my 2080 Super, which I have since sold.


----------



## deadman3000 (Mar 24, 2021)

With the stock VBIOS on the reference 6700 XT overclocked to 2.8GHz, 2130MHz memory @ 1100mV, a 70% fan curve and a 15% power limit, on a 3700X (PBO 300/230/230) with DDR4-3600 at XMP1 on an MSI B450M Mortar (latest BIOS, SAM enabled), I got 12116 in Time Spy. Using MorePowerTool I loaded the XFX.RX6700XT.12272.210128.rom, which raises the power limit from 186 watts to 211 watts, and applied it. After a reboot I got a score of 12328. I was also able to boost the score to 12355 by bumping the memory to 2150MHz, but I have not tested that for stability. I am now top of the leaderboard for people who benched a 3700X + 6700 XT. Now I want a 5800X 

I shall wait until some others try raising their power limit above 211 watts before I chance doing so myself.


----------



## kapone32 (Mar 24, 2021)

I am downloading the latest driver right now. I look forward to seeing what this does.


----------



## Felix123BU (Mar 24, 2021)

kapone32 said:


> I am downloading the latest driver right now. I look forward to seeing what it does.


Would be cool if you could do a before/after performance comparison in 1-2 games. I'm really curious whether you're also in awe of the latest driver like I am


----------



## kapone32 (Mar 24, 2021)

Felix123BU said:


> Would be cool if you could do a before/after performance comparison in 1-2 games. I'm really curious whether you're also in awe of the latest driver like I am


Well, I just finished running 3DMark. The crazy thing is it showed a boost clock of 2687 MHz!!!!! I am not even sure which game to try first, but I have to pick up my daughter from daycare, so I think I will run a TWWH2 benchmark.


----------



## turbogear (Mar 30, 2021)

So I was experimenting and fine tuning my GPU further. 

One of my concerns in the past was the thermal contact between the VRAM/VRM and the water block, because the standard pads that EK provides with the block are 3.5 W/m·K.
I ordered EC360 Silver pads, which are rated at 12 W/m·K (1 mm).

See the before and after photos from the GPU below.
I also replaced the EK pads on the backplate with EC360 (1.5 mm and 2 mm).

For the GPU die I used Thermal Grizzly Kryonaut as usual.
As expected, the hotspot remains in the same range after the rebuild; the max hotspot in Time Spy was always 69°C-70°C.
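For anyone curious why pad conductivity matters, the temperature drop across a pad can be ballparked with one-dimensional conduction. Everything below is an illustrative assumption (5 W through one GDDR6 package, a 14 x 12 mm footprint, a fully compressed 1 mm pad); only the 3.5 and 12 W/m·K ratings come from this post:

```python
# Temperature drop across a thermal pad: dT = P * t / (k * A)
# P = heat through the pad [W], t = pad thickness [m],
# k = thermal conductivity [W/(m*K)], A = contact area [m^2].

def pad_delta_t(power_w: float, thickness_m: float,
                k_w_mk: float, area_m2: float) -> float:
    """1-D conduction estimate of the drop across a single pad."""
    return power_w * thickness_m / (k_w_mk * area_m2)

P = 5.0            # assumed heat per GDDR6 package, W
t = 1e-3           # 1 mm pad, fully compressed
A = 0.014 * 0.012  # assumed 14 mm x 12 mm package footprint

for k, label in [(3.5, "EK stock pad"), (12.0, "EC360 Silver")]:
    print(f"{label}: ~{pad_delta_t(P, t, k, A):.1f} K across the pad")
```

Same heat and thickness, so the drop scales as 1/k: roughly 8.5 K with the stock pads versus 2.5 K with the 12 W/m·K ones under these assumptions, which is consistent with cooler VRAM/VRM readings even when the die hotspot barely moves.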

Now I can push the core power draw towards 340W total with more peace of mind.

Unfortunately I don't have a thermal camera to compare the temperature difference on the PCB.

I did notice that recent versions of HWiNFO have added a Vcore temperature measurement *(GPU VR VDDC Temperature)*.
This is the hotspot temperature of the Vcore. The reference 6800XT card has a 10-phase Vcore.
The power stages are International Rectifier TDA21472 DrMOS parts, rated for -40°C to 125°C.
My max temperature reading under benchmarks is 50°C.

I also noticed that, for some reason, the coil whine has improved since I changed the pads. 






Here are my current benchmark results for the 6800XT with driver 21.3.2.
Some increase with the new driver. 


With the new results I hit third place on the leaderboard for Time Spy and Port Royal, sorted by GPU score, for a 6800XT with a 5800X processor.
I could raise the setting from 2745MHz@1020mV to 2750MHz@1020mV without a crash. It used to crash before.

Port Royal: Search (3dmark.com)
Time Spy: Search (3dmark.com)

Results with 21.3.1 are here:








The RX 6000 series Owners' Club (www.techpowerup.com)


----------



## Butanding1987 (Mar 31, 2021)

Any reason why you're not using 63A for TDC limit SOC?


----------



## turbogear (Mar 31, 2021)

Butanding1987 said:


> Any reason why you're not using 63A for TDC limit SOC?


Well, after my recent testing I saw that for some reason my card can still achieve the same 2110MHz on VRAM even without increasing the SoC TDC.
Maybe the better cooling pads affected that?
I have no idea, but in yesterday's test, 2110MHz on VRAM without the TDC increase did not cause my score to regress. 

After changing the pads I more or less started from default MPT settings and raised the parameters one at a time.

I will test it again in the next days anyway and see if I can go higher with the VRAM frequency, or whether 2114MHz is still my absolute limit.

I am even tempted to test whether more than 340W on the GPU core gives more performance.
With 340W on the core I am already beyond the official 375W limit of a 2x8-pin card, as I draw approximately 385W in total, but with the VRM only hitting 50°C during benchmarks, I am tempted to try another 10-15W.
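For reference, that 375W figure is just the sum of the PCIe power-delivery ratings. A quick sanity check of the budget (the connector figures are the spec values; the 385W draw is the estimate from this post):

```python
# PCIe power budget for a 2x8-pin card, per the CEM spec values:
PCIE_SLOT_W = 75   # max power from the x16 slot
EIGHT_PIN_W = 150  # max power per 8-pin PCIe auxiliary connector

spec_limit = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # official 375 W budget
estimated_draw = 385                         # total board draw estimated above

print(f"spec limit:   {spec_limit} W")
print(f"over spec by: {estimated_draw - spec_limit} W")
```

In practice the connectors have healthy margins, but anything past 375W is outside what a 2x8-pin board is rated for on paper.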


----------



## basco (Mar 31, 2021)

interesting that there is so much difference in the time spy score but not port royal with nearly the same clocks.
congrats on the scores, even if i think you are overdoing it when ya mention a 5mhz difference in ram - but more power to you, you really are tickling everything out of that card


----------



## turbogear (Mar 31, 2021)

basco said:


> interesting that there is so much difference in the time spy score but not port royal with nearly the same clocks.
> congrats on the scores, even if i think you are overdoing it when ya mention a 5mhz difference in ram - but more power to you, you really are tickling everything out of that card


The 5MHz higher GPU clock does not bring much; it just looks nicer rounded to 2750MHz instead of 2745MHz. 

The 72 extra Time Spy points, from 20608 to 20680, came with the same power target for core and TDC. So those 72 points most probably come from the new driver 21.3.2 vs 21.3.1, and the better cooling could also have helped.
The Port Royal score is also about 67 points higher.

Yes, I believe that is the limit of my chip.  

In games with these settings, the power draw is not that high.
Here is a test from Cyberpunk.








The RX 6000 series Owners' Club (www.techpowerup.com)




Once the new patch with ray tracing support is available, I will check how much higher the power consumption goes.


----------



## basco (Mar 31, 2021)

and now go rock the charts at hwbot !






3DMark - Time Spy overclocking records @ HWBOT (hwbot.org)


----------



## Felix123BU (Mar 31, 2021)

turbogear said:


> So I was experimenting and fine tuning my GPU further.
> 
> One of the concerns in the past was the binding of the VRAM and VRM to the water block because the standard pads that EK provides with the GPU are 3.5W/m·K.
> I have ordered EC360 Silver which are 12 W/m·K (1mm pads).
> ...


You said you noticed some increased performance, curious, how much and in what games/apps?


----------



## turbogear (Mar 31, 2021)

basco said:


> and now go rock the charts at hwbot !
> 
> 
> 
> ...


It is tempting.  
I am not a member there though. 



Felix123BU said:


> You said you noticed some increased performance, curious, how much and in what games/apps?


As mentioned in the post above, the increase in Time Spy was 72 points, from 20608 to 20680, and the Port Royal score is also about 67 points higher.
I have not tested games with the new driver yet.


----------



## Butanding1987 (Apr 1, 2021)

turbogear said:


> It is tempting.
> I am not a member there though.
> 
> 
> ...


Maybe you should test it further. Sometimes I get 70 points more in Port Royal depending on the time of day.


----------



## Athlonite (Apr 1, 2021)

Butanding1987 said:


> Maybe you should test it further. Sometimes I get 70 points more in Port Royal depending on the time of day.



I also find it helps to stand on one leg while hopping up and down and sticking out my tongue


----------



## Felix123BU (Apr 1, 2021)

Athlonite said:


> I also find it helps to stand on one leg while hopping up and down and sticking out my tongue


You guys are not up to speed, a Lisa Su enchantment will make it glow in the dark with no power needed


----------



## Felix123BU (Apr 4, 2021)

I was wondering if anybody else has got their hands on a 6000 series GPU. Looking at the Steam hardware survey, one would think nobody has any 6000 GPUs, which is weird since I know at least six people who got one of the 6000 models, not including the venerable members of this thread


----------



## Space Lynx (Apr 4, 2021)

Felix123BU said:


> I was wondering if anybody else has got their hands on a 6000 series GPU. Looking at the Steam hardware survey, one would think nobody has any 6000 GPUs, which is weird since I know at least six people who got one of the 6000 models, not including the venerable members of this thread



the survey is flawed imo, they should just send it to all steam users. it literally takes 5 seconds. it's whatever, i don't care anymore


----------



## Felix123BU (Apr 4, 2021)

lynx29 said:


> the survey is flawed imo, they should just send it to all steam users. it literally takes 5 seconds. it's whatever, i don't care anymore


Yup, in 10 years (incidentally, only AMD GPUs those 10 years) I have not had a single survey   
I wouldn't go full conspiracy theory, but as you say, if they pushed this to all clients it would probably be the easiest way to estimate how many were produced by both Nvidia and AMD and how many got into the hands of gamers.
I love my 6800XT, wish everyone could have one


----------



## Space Lynx (Apr 4, 2021)

Felix123BU said:


> Yup, in 10 years (incidentally, only AMD GPUs those 10 years) I have not had a single survey
> I wouldn't go full conspiracy theory, but as you say, if they pushed this to all clients it would probably be the easiest way to estimate how many were produced by both Nvidia and AMD and how many got into the hands of gamers.
> I love my 6800XT, wish everyone could have one




there is no reason NOT to give the survey to all steam users... it's the 21st century, people, not snail mail... haha


----------



## kapone32 (Apr 5, 2021)

The problem with Steam is you can have your account tied to all of your PCs. I wonder which one it picks to determine your position?


----------



## Space Lynx (Apr 5, 2021)

kapone32 said:


> The problem with Steam is you can have your account tied to all of your PCs. I wonder which one it picks to determine your position?



Probably whichever one you login to first. /shrug


----------



## turbogear (Apr 5, 2021)

Felix123BU said:


> I was wondering if anybody else has got their hands on a 6000 series GPU. Looking at the Steam hardware survey, one would think nobody has any 6000 GPUs, which is weird since I know at least six people who got one of the 6000 models, not including the venerable members of this thread


I have not played any Steam game in the last few months since I got the 6800XT.
So mine will not be listed there anyway.


----------



## sepheronx (Apr 5, 2021)

I've got a peculiar issue with my 6800XT.

I found that when playing games, the core speed drops a lot and causes stuttering. I'll play, let's say, Witcher 3, and the GPU usage will randomly drop to 60% and the core speed to 500MHz, then shoot back up. It does this very regularly.


----------



## Space Lynx (Apr 5, 2021)

sepheronx said:


> I got a particular issue with my 6800xt.
> 
> I found that when playing games, core speed drops a lot and causes stuttering.  I'll play let's say Witcher 3 and the GPU usage will randomly drop to 60% and the core speed drops to 500mhz, then shoots back up.  It does this very regularly.



never happened to me, and I have played witcher 3 with my 6800 non-XT many times.

try switching between exclusive fullscreen and borderless window. i have heard that game sometimes has weird issues with settings


----------



## Felix123BU (Apr 5, 2021)

sepheronx said:


> I got a particular issue with my 6800xt.
> 
> I found that when playing games, core speed drops a lot and causes stuttering.  I'll play let's say Witcher 3 and the GPU usage will randomly drop to 60% and the core speed drops to 500mhz, then shoots back up.  It does this very regularly.


Setting a minimum frequency in Wattman (2000MHz for starters) might help; this happens with old titles, even though W3 is by no means that old


----------



## sepheronx (Apr 5, 2021)

I'll give that a try. I looked on Reddit and plenty of people have the same issue. It isn't just Witcher 3 either; it's the RE2 and RE3 remakes, Rage 2, etc. The only game that ran at full usage was Tropico 6.

I was also told to use a profile manager for all my games. Wow.


----------



## Felix123BU (Apr 5, 2021)

sepheronx said:


> I'll give that a try. I looked on Reddit and plenty of people have the same issue. It isn't just Witcher 3 either; it's the RE2 and RE3 remakes, Rage 2, etc. The only game that ran at full usage was Tropico 6.
> 
> I was also told to use a profile manager for all my games. Wow.


I can't tell how you installed the drivers, so I will not make assumptions, but generally it's always best to make sure any old driver is completely removed before installing a new one (DDU in safe mode is good at that).
The only game where I had the issue you described was Gothic 3, but that's really old, and I "fixed it" by setting my main profile to a 2.5GHz minimum (OC'd to 2.7GHz), so basically set the minimum frequency 100-200MHz below the max. I am speaking of a general profile; you can also make game-specific profiles, but that's a waste of time in my opinion  

something like the below


----------



## sepheronx (Apr 5, 2021)

Way ahead of you regarding drivers. Been doing this for over 20 years. It still gets me that there isn't anything built in and you have to rely on DDU.

But I'll give a try to min clock speeds. Hopefully that helps.

Thanks brother.


----------



## Felix123BU (Apr 5, 2021)

sepheronx said:


> Way ahead of you regarding drivers. Been doing this for over 20 years.
> 
> But I'll give a try to min clock speeds. Hopefully that helps.
> 
> Thanks brother.


Just thought of another possible cause of these kinds of problems, though I'm not sure it applies to you: there were cases of Windows power management being overzealous in clocking down the PCIe lanes, which caused GPU slowdowns. Switching to a High Performance power plan or equivalent (and making sure the sub-settings in that plan do not allow power saving for PCIe - Link State Power Management) could also be worth trying, assuming you don't run the highest plan already; in that case, the minimum frequency should "fix" it


----------



## sepheronx (Apr 5, 2021)

Felix123BU said:


> Just thought of another possible issue that would create these type of problems, but not sure if it can be applied to you, there where cases of Windows power management being overzealous in clocking down the PCI-E lanes which caused GPU slowdowns, switching to a highest Performance power plan or equivalent (and making sure the sub settings in that profile do not allow power saving for PCI-E - Link state power management) could also be worth trying, assuming you don't run it at highest already, in that case min frequency should "fix" it


I'll also check; I can't remember if I enabled it or not.

It's just sad that I see so many others at the AMD site and on Reddit with the same issue as me. I figured AMD would have fixed their driver issues by now. No sign of this issue on my 3070.


----------



## Felix123BU (Apr 5, 2021)

sepheronx said:


> I'll also check; I can't remember if I enabled it or not.
> 
> It's just sad that I see so many others at the AMD site and on Reddit with the same issue as me. I figured AMD would have fixed their driver issues by now. No sign of this issue on my 3070.


It's always bad when you get a new piece of hardware and it does not perform how it should. For me personally, the drivers with the 6800XT have been the most trouble-free ever: rock solid, with very minor exceptions. But there are so many possible combinations of hardware and software settings that sometimes you end up on the trouble side of things. Another thing you could do is report the issue to AMD via the bug report tool (I think that's what it's called; it should be inside Wattman). I reported one minor bug, the OSD metrics sometimes not remembering their custom settings and resetting to defaults, and the next driver had it in the fixed list. I was like, damn, I could not find anybody else on the internet with that issue


----------



## sepheronx (Apr 5, 2021)

Felix123BU said:


> Its always bad when you get a new piece of hardware and it does not perform how  it should. What I can say for me personally, the drivers with the 6800XT have been the most trouble-free drivers ever, rock solid with very minor exceptions. But there are so many possible combinations of hardware and software settings, sometimes you end up on the trouble side of things. Another thing you could do is report the issue to AMD via the AMD bug report tool I think its called, should be inside wattman. I reported one minor bug, the OSD metrics sometime not remembering their custom settings and resetting to default settings, and next driver that was in the fixed list, I was like damn, I could not find anybody else having that issue on the internet








RX 6800 XT Low GPU Usage (community.amd.com)

GPU: RX 6800 XT Sapphire Pulse (Fresh from the store) CPU: Ryzen 7 3800X RAM: 16GB 2666Mhz DDR4 Motherboard: Asrock B550AM Gaming PSU: Seasonic 1000W   I'm playing in 1080p and I'm getting very low performance out of this card, I got more fps with my old RX 5700 XT. My GPU usage jumps around a...





https://www.reddit.com/r/radeon/comments/km7bbz

https://www.reddit.com/r/AMDHelp/comments/lp72f7/_/gt8fp5k

Same issue


----------



## Felix123BU (Apr 5, 2021)

sepheronx said:


> RX 6800 XT Low GPU Usage
> 
> 
> GPU: RX 6800 XT Sapphire Pulse (Fresh from the store) CPU: Ryzen 7 3800X RAM: 16GB 2666Mhz DDR4 Motherboard: Asrock B550AM Gaming PSU: Seasonic 1000W   I'm playing in 1080p and I'm getting very low performance out of this card, I got more fps with my old RX 5700 XT. My GPU usage jumps around a...
> ...


Read most of it; there are a lot of potential causes: CPU overheating and clocking down massively, which would starve the GPU (I would check CPU temps while gaming; if it gets way hot, that would be both your problem and your solution), RAM errors, power supply issues, even a faulty GPU, or drivers. But since these issues only appear for a small number of people, my guess would be a bottleneck somewhere in the system, and that can come in many forms and flavors. I worked for 10 years in hardware troubleshooting and have seen a lot of issues over time that were caused by hardware problems and could be solved relatively easily. (Funny enough, CPU overheating was a recurring thing.)

Anyway, I hope you find a solution; these GPUs are so good when working properly


----------



## sepheronx (Apr 5, 2021)

Felix123BU said:


> Read most of it, there can be a lot of potential issues, CPU overheating and clocking down massively which would starve the GPU (I would check CPU temps while gaming, if it gets way hot, that would be your issue and solution aswell), RAM errors, power supply issues, even a faulty GPU, or even driver related, but since these issues only appear for a small amount of people my guess would be a bottleneck somewhere in the system, and that can come in many forms and flavors. Worked for 10 years in hardware troubleshooting, have seen a loooot of issues over time that are cause by hardware problems and could be solved relatively easy. (funny enough, CPU overheating was a recurring thing   )
> 
> Anyways, I hope you find a solution, these GPUs are soo good when working properly


In Shadow of the Tomb Raider and some other titles it runs at full speed with no issues at all. It is a solid performer; it's just picky.

Will play around with it this weekend.


----------



## sepheronx (Apr 7, 2021)

Update: So Felix, I took your suggestion and moved the performance sliders around:





Anyway, lo and behold, it more or less fixed my issue. I know that playing games at 1080p on a card meant for high 1440p to mid 4K isn't stressing the card at all and keeps it from its full potential, but this power saving is nonsense and caused pretty much all my stuttering. Seems good now. I will make adjustments to performance later, as I've heard these 6800XTs are solid overclockers, giving pretty much on-par performance with 6900XTs.


----------



## Felix123BU (Apr 7, 2021)

sepheronx said:


> Update: So Felix, I took your suggestion and moved the performance sliders around:
> View attachment 195598
> 
> 
> Anyway, lo and behold, it more or less fixed my issue. I know that playing games at 1080p on a card meant for high 1440p to mid 4K isn't stressing the card at all and keeps it from its full potential, but this power saving is nonsense and caused pretty much all my stuttering. Seems good now. I will make adjustments to performance later, as I've heard these 6800XTs are solid overclockers, giving pretty much on-par performance with 6900XTs.


Nice, glad you managed to fix it; always a shame to get new hardware that does not work as expected, especially such elusive items in this day and age as a GPU.  
And yeah, the 6800XTs are mostly great OC-ers, that is, if you manage to keep them cool while overclocking (power mods help a lot, but more power, more heat). I myself managed about 2.5GHz on the stock reference cooler, and now 2.7GHz with a waterblock on it, and yup, it's basically a bit better than a stock 6900XT. One thing I can say about performance: first I had it with my old 3800X, then moved to a 5800X, and the jump in overall GPU performance was greater than expected. It was sweet with the 3800X and became something else with the 5800X 
Also, AMD should know better: the boost should be on for every resolution no matter how much the GPU is actually used. It's better with this series; the Vega64 I had before had even more issues boosting high in low-demand games. At least with the 6000 series, if you set a minimum frequency, it's actually applied


----------



## sepheronx (Apr 7, 2021)

This Gigabyte OC card doesn't run very hot at all. I believe it uses Windforce fans, and those are apparently pretty premium, plus a good aluminum backplate.

So I hope to get a decent OC out of it. The case I use is the MSI MPG 110M, and I've got 3 InWin Saturn fans in the front; those things push like 70 CFM, so temps are good. If I can get a decent OC, I will be golden with this card for a long time. Maybe I'll find an AIO that will work for it (I don't care to go full water cooling).


----------



## Felix123BU (Apr 8, 2021)

Not bad   

RX 6800 XT Midnight Black








AMD to launch Radeon RX 6800 XT "Midnight Black" today - VideoCardz.com

AMD to launch Radeon RX 6800 XT in black. AMD is making a surprising announcement today. At approximately 6 AM PST today, AMD will unveil its new black edition of the Radeon RX 6800 XT. The manufacturer has sent out a special notice to its Red Team members, from whom we received a tip. The […]


----------



## Butanding1987 (Apr 8, 2021)

Felix123BU said:


> Not bad
> 
> RX 6800 XT Midnight Black
> 
> ...


Out of stock at launch.


----------



## Felix123BU (Apr 8, 2021)

Butanding1987 said:


> Out of stock at launch.


But of course, it's a GPU. No such thing as an in-stock GPU


----------



## turbogear (Apr 8, 2021)

Felix123BU said:


> Not bad
> 
> RX 6800 XT Midnight Black
> 
> ...


For us it does not make a difference whether the cooler is black or silver; we would have converted ours to water cooling anyway.  
We will just have a black cooler sitting in the closet rather than a silver one. 

I thought I might get this new card from AMD that you mention. I was on the AMD website at launch time, and it was gone even before I could put it into the basket. 

My current 6800XT is dying from the *ESD shock* it received last week.
It has bad stutter now; I think it will eventually die completely.
I was thinking of putting the original cooler back on and sending it in for repair. No chance of that, as the warranty-void sticker is obviously missing since I had the card under a water block.
That is the risk one takes when converting a GPU to water cooling: the warranty is void, and if for some reason the card needs to be returned, you are screwed.

It's good that in the USA these stickers are no longer valid.
In Germany they are still valid, and the seller told me I can keep the card: no warranty when the sticker is gone and the cooler was removed by me. 

With no luck on the repair, I needed a new GPU, so I ordered a new 6800XT Red Devil from Mindfactory.
Hopefully I will get it by next week. 
It was supposed to arrive this week, but the DHL idiots screwed up, claimed my address does not exist, and returned the card to Mindfactory without attempting a second delivery. 
Mindfactory will send it again once it is back at their place.
Sometimes one just has a very shitty week, I have to say.


----------



## GamerGuy (Apr 9, 2021)

@turbogear , Oh man, that's so messed up! Hope you get the replacement card ASAP. The warranty sticker thing does apply in my neck of the woods: tamper with the stickers and you void the warranty. That's one of the reasons why I'd gotten a Nitro+ RX 6800, which was replaced by the Nitro+ RX 6900 XT. 

Would have loved to try an LC'ed RX 6800 XT/6900 XT, but the warranty issue was the deciding factor... that, and my lack of confidence with applying the thermal pads (not sure of thickness and size) and the overall uncertainty I feel with LC. Guess I'm too old skool; besides, I'm something of an old fart, and I sorta personify the old saying "you can't teach an old dog new tricks".


----------



## Butanding1987 (Apr 9, 2021)

How did it happen?


turbogear said:


> My current 6800XT is dying  with the *ESD shock* that it received last week.
> It has bad stutter now. I think eventually it will completely die.


----------



## turbogear (Apr 9, 2021)

GamerGuy said:


> @turbogear , Oh man, that's so messed up! Hope you get the replacement card ASAP. The warranty sticker thing does apply in my neck of the woods: tamper with the stickers and you void the warranty. That's one of the reasons why I'd gotten a Nitro+ RX 6800, which was replaced by the Nitro+ RX 6900 XT.
> 
> Would have loved to try an LC'ed RX 6800 XT/6900 XT, but the warranty issue was the deciding factor... that, and my lack of confidence with applying the thermal pads (not sure of thickness and size) and the overall uncertainty I feel with LC. Guess I'm too old skool; besides, I'm something of an old fart, and I sorta personify the old saying "you can't teach an old dog new tricks".


I was unlucky this time.   
I have been using LC on my cards for many Radeon generations and never before had to return a card and deal with the warranty topic.

I may still do it on the new card, but I will contact PowerColor first to check their stance on changing the cooler and thermal paste. 
My card that died is from XFX. If you read their USA warranty site, they allow changing the cooler, but you should contact them directly first. I did not do that, and in Germany they don't have a direct RMA option. 
You need to go through the shop where you bought it, and the shop told me that since the sticker is broken, the warranty is lost.


----------



## ratirt (Apr 9, 2021)

turbogear said:


> For us it does not make difference if it is black cooler or silver, we would have turned ours anyway to water cooling.
> We will have black cooler sitting in the closet rather than a silver one.
> 
> I thought I may get it this new card that you mention from AMD. I was at AMD website at launch time and even before I can put it into basket it was gone.
> ...


Dude, what did you do to the card? An ESD shock? Why didn't you wear a grounding band or something? Did you get carried away with the water block setup while changing the cooler? 
Man, I only hope that's not the case. Maybe there is some excessive force on the components and you can loosen it up a bit? I'd still try the warranty and explain that you put a water block on to preserve the card's life. If you get the new card, will the block you already have fit the Red Devil? Not likely.


----------



## turbogear (Apr 9, 2021)

Butanding1987 said:


> How did it happen?


I was topping up the water loop as it was missing some liquid.
The computer was on.
Unluckily for me, my 5-year-old daughter was playing near me and touched my arm while I had my hand on the GPU.
Me, her, and the GPU got an ESD shock. The computer rebooted instantly.
Ever since that incident the GPU has been stuttering in 3D applications.
It is like pressing a pause button for a fraction of a second.
In the clock charts you can see the memory and GPU clocks drop to a few MHz very briefly.

Usually I am very careful with ESD and discharge myself before touching components. When it is cold outside and the heaters are running, the humidity inside my house sometimes drops below 40%, and that was the case that day. When humidity is low, ESD is a problem, especially if you are wearing shoes with plastic soles.
I usually wear ESD protection shoes when doing electronics work, but sometimes you just have bad luck on a certain day and things go wrong.


----------



## ratirt (Apr 9, 2021)

turbogear said:


> I was filling the water loop as it was missing some liquid.
> The computer was on.
> Unlucky for me my 5 year old daughter was playing near me and she touched my arm and I had my hand touching the GPU.
> Me, her and GPU got an ESD shock. The computer rebooted instantly.
> ...


It doesn't have to be the card, if the card was in the computer and you touched the computer, not the card. Have you seen Jayz's video? They did ESD shocks on purpose to a computer; it restarted, shut off, and was still working. Maybe you need to sit down, pull all the parts out, and put it all back together. I would try it. Maybe it is not the card but something else?


----------



## basco (Apr 9, 2021)

you can get the stickers off with a heat gun on low, softly pushing from beneath - try not to pull.
i use something like this:


----------



## turbogear (Apr 9, 2021)

basco said:


> you can get the stickers off with a heat gun on low, softly pushing from beneath - try not to pull.


Actually, I tried that when I took the card apart. I have a heat gun and heated the stickers.
One came off without issue and I still have it intact. The other one developed a crack.
According to the seller, if the sticker is cracked the warranty no longer exists.


----------



## basco (Apr 9, 2021)

often they are the same on cheap cards, even from other vendors

i know it's not easy - it takes a long time to get them off safely, and i have shaky hands.

i too think the card should be good even after the esd - i really hope so for ya


----------



## ratirt (Apr 9, 2021)

turbogear said:


> Actually, I tried that when I took the card apart. I have a heat gun and heated the stickers.
> One came off without issue and I still have it intact. The other one developed a crack.
> According to the seller, if the sticker is cracked the warranty no longer exists.


Hold on. The seller should take the card and send it back for testing/repair. They are not fixing anything themselves, so they shouldn't say the warranty does not apply. The manufacturer decides whether the warranty has been voided.


----------



## turbogear (Apr 9, 2021)

ratirt said:


> It doesn't have to be the card. If the card was in the computer and you touched the computer not the card. Have you seen Jayz video? They were doing the ESD Shocks on purpose to the computer. It restarted shut off and it was still working. Maybe you need to sit down and pull all parts out put it all back together. I would try it. Maybe it is not the card but something else?


I already did that over the past days.
My first suspicion was the power supply.
I took it out and put my son's PSU in my computer, but that did not help.
Then I put my GPU in his computer, and the problem moved with the GPU, so it should be the GPU.


----------



## sepheronx (Apr 9, 2021)

turbogear said:


> I already did that over the past days.
> My first suspicion was the power supply.
> I took it out and put my son's PSU in my computer, but that did not help.
> Then I put my GPU in his computer, and the problem moved with the GPU, so it should be the GPU.


Are you sure it's the GPU itself and not the AMD drivers?

My 6800XT was doing the same thing until I had to force clock speeds.

If all is lost, RMA it.


----------



## ratirt (Apr 9, 2021)

turbogear said:


> Already did that in the past days.
> My first suspect was the power supply.
> I took it out and put the one from my son's computer into mine, but that did not help.
> Then I put my GPU into his computer and the problem moved with the GPU to the new computer, so it should be the GPU.


I'm just saying, if you haven't ESD-shocked the GPU directly (for example while changing the cooler), but the shock happened while all the components were in the computer, it doesn't necessarily mean the GPU is broken.
It happened to me once and I thought the PSU had somehow been damaged. Swapping it solved nothing. I struggled with it for a while, then decided to reinstall Windows, and all the problems were gone.
If I were you, I would try some things before blaming it all on the GPU just because you have stutter or glitches.



sepheronx said:


> Are you sure it's the GPU itself and not AMD drivers?
> 
> My 6800xt was doing same thing till I had to force clock speeds.
> 
> If all is lost, RMA it.


Exactly. I wouldn't conclude it's the GPU just because of stutter in games. I would rather track down the issue to be 100% sure it's that. In my case it wasn't the GPU, even though the computer behaved weirdly.


----------



## turbogear (Apr 9, 2021)

ratirt said:


> Hold on. The seller should get the card and send it back for testing/repair. They are not fixing anything so they shouldn't say the warranty does not apply. The manufacturer decides if the warranty has been voided or not.


I am not too sure about that, but I have it in mind to put the original cooler back on, put the stickers on the screws, and send it to them anyway.
In the best case they won't notice the crack in the sticker and will repair the card. In the worst case I get it back and pay some money for them to ship it to me.  

I want to try it after I have the new GPU in my computer; otherwise I have no running computer.
If I get lucky and they repair it, I can always sell it.


----------



## ratirt (Apr 9, 2021)

turbogear said:


> I am not too sure about that, but I have it in mind to put the original cooler back on, put the stickers on the screws, and send it to them anyway.
> In the best case they won't notice the crack in the sticker and will repair the card. In the worst case I get it back and pay some money for them to ship it to me.
> 
> I want to try it after I have the new GPU in my computer; otherwise I have no running computer.
> If I get lucky and they repair it, I can always sell it.


Try finding the problem or confirming for sure that it is the GPU. It's hard for me to believe it is, since the shock happened while the components were installed in the computer.
I would pull everything apart, put it all back together (maybe swapping some parts like the PSU, RAM, etc.) and reinstall Windows. Basically start from scratch, all stock with no OC on any component, and then give it a shot.


----------



## turbogear (Apr 9, 2021)

sepheronx said:


> Are you sure it's the GPU itself and not AMD drivers?
> 
> My 6800xt was doing same thing till I had to force clock speeds.
> 
> If all is lost, RMA it.


I am pretty sure it is not the driver.
I run Time Spy very often. I had run it that day as well, before the incident, and there was never a stutter in Time Spy.
I ran it right away after the incident and noticed very bad stutter.
Now when I start Time Spy, every few minutes the benchmark stutters and the GPU clock and memory frequency drop.



ratirt said:


> Try finding the problem or confirming that this is the GPU for sure. It's hard for me to believe it is since the shock happened when the components were installed in the computer.
> I would pull everything apart, put it all back together (maybe change some parts like PSU, ram etc.) and reinstall windows. Basically start from scratch and then try. No OC on any component. All stock and then give it a shot.


I have done many things already, including going back to the older BIOS on my board, the one without the USB fix from AMD.

As I said, I moved the card to another computer. I ran DDU on that computer first to remove all the NVIDIA drivers from it.
Installed the card, ran tests, and the stuttering moved to the new computer.
I also put my son's GPU, a GTX 1070 Ti, into my computer and there was no stutter over many hours of testing.


----------



## sepheronx (Apr 9, 2021)

turbogear said:


> I am pretty sure it is not the driver.
> I run Time Spy very often. I had run it that day as well, before the incident, and there was never a stutter in Time Spy.
> I ran it right away after the incident and noticed very bad stutter.
> Now when I start Time Spy, every few minutes the benchmark stutters and the GPU clock and memory frequency drop.


I'm sorry to hear that. Just try AMD control panel adjustments first. If nothing works, yeah, RMA. And keep us updated, including on your RMA experience.


----------



## ratirt (Apr 9, 2021)

turbogear said:


> I am pretty sure it is not the driver.
> I run Time Spy very often. I had run it that day as well, before the incident, and there was never a stutter in Time Spy.
> I ran it right away after the incident and noticed very bad stutter.
> Now when I start Time Spy, every few minutes the benchmark stutters and the GPU clock and memory frequency drop.
> ...


You have to keep in mind that the 1070 Ti uses way less power. Sometimes it just doesn't make sense how components act in a system. If you are sure it's the GPU, try RMA and call it a day. 
BTW, how much did you pay for the Red Devil 6900XT? Curious what the price is now in Deutschland.


----------



## Butanding1987 (Apr 9, 2021)

turbogear said:


> I was filling the water loop as it was missing some liquid.
> The computer was on.
> Unlucky for me my 5 year old daughter was playing near me and she touched my arm and I had my hand touching the GPU.
> Me, her and GPU got an ESD shock. The computer rebooted instantly.
> ...


Which part of the GPU were you touching? I mean, it's covered by the block and it even has a backplate. How could it have been damaged by that?


----------



## turbogear (Apr 9, 2021)

ratirt said:


> You have to keep in mind that the 1070TI uses way less power. Sometimes it just doesn't make sense how components act in a system. If you are sure it's the GPU try RMA and call it a day.
> BTW, how much did you pay for the Red Devil 6900XT? Curious what the price is now in Deutschland.


I did not order a 6900XT but a 6800XT Red Devil. It cost me 1299€.  

Once the new card arrives, that will be another test for the computer.
If I notice there is still strange behavior with the new 6800XT, I will need to debug further, but at the moment I am pretty sure it is my GPU that is affected.

My debugging so far leads me to believe that the VRAM took the hit.
With the VRAM at default frequency the stuttering is there but does not happen that often. If I increase it to 2050MHz the stuttering becomes more dominant, and if I put it at 2100MHz, where it was before the incident, it stutters every 2 to 3 minutes; running FurMark, for example, you see a lot of dips.
Before the incident, my VRAM was at 2100MHz for months with no stuttering.
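To make that VRAM-vs-stutter comparison less subjective, one option is to log the GPU clock during a FurMark run and count the dips at each VRAM setting. Here is a rough Python sketch; it assumes a hypothetical CSV log with `time_s` and `gpu_clock_mhz` columns (the column names are placeholders for whatever your monitoring tool exports):

```python
# Sketch: count GPU clock dips in a logged run, to compare stutter
# frequency across VRAM settings. Assumes a hypothetical CSV log
# with "time_s" and "gpu_clock_mhz" columns exported by a monitor.
import csv

def count_dips(path, threshold_mhz=1500):
    """Count excursions where the clock falls below threshold_mhz
    after having been above it (one dip per excursion)."""
    dips = 0
    below = False
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["gpu_clock_mhz"])
            if clock < threshold_mhz and not below:
                dips += 1       # new excursion below the threshold
                below = True
            elif clock >= threshold_mhz:
                below = False   # recovered; arm for the next dip
    return dips
```

Running it once per VRAM setting (default, 2050MHz, 2100MHz) would turn "stutters every 2 to 3 minutes" into a number you can compare directly.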

I tried a 6900XT Red Devil, as you may remember, about 1.5 months ago, was not happy with its performance advantage over the 6800XT, and returned it at the time. I would not like to pay 1650€ for the 6900XT Red Devil for only a small performance gain, as the OC headroom on the 6800XT is usually much better.



Butanding1987 said:


> Which part of the GPU were you touching? I mean, it's covered by the block and it even has a backplate. How could it have been damaged by that?


The backplate on the EK block is made of aluminum. There is no coating on it; it is bare silver metal. My hand was touching it. There are no insulating plastic washers between the backplate and the PCB screw holes.
If you look at the bare GPU, there is a contact ring around the screw holes, and I believe this contacts the ground plane of the PCB.









EK-Quantum Vector RX 6800/6900 D-RGB - AMD Radeon Edition
EK-Quantum Vector RX 6800/6900 is a special edition 2nd-generation Vector GPU water block from the EK® Quantum Line. It is made for graphics cards based on the latest AMD® RDNA2™ architecture. This water block fits most reference PCB designs of the Radeon RX 6800, RX 6800XT, and RX 6900 GPUs...
www.ekwb.com


----------



## ratirt (Apr 9, 2021)

turbogear said:


> I did not order a 6900XT but a 6800XT Red Devil. It cost me 1299€.
> 
> Once the new card arrives, that will be another test for the computer.
> If I notice there is still strange behavior with the new 6800XT, I will need to debug further, but at the moment I am pretty sure it is my GPU that is affected.
> ...
> ...


Oh, the 6800XT. I got it wrong, buddy. I thought you wanted to pull out all the stops and go 6900XT. 
I remember you had a 6900XT Red Devil but were not satisfied with the performance. 

Try things and find the problem, but like I said before, it is hard for me to believe it is the card, since all the components were connected in the computer and the shock (I assume) went to the case, not a specific component. When you say VRAM, that's very specific and even harder to believe, but I won't judge. Maybe you are right, but it would be strange if that turned out to be true.


----------



## turbogear (Apr 9, 2021)

ratirt said:


> Try stuff and find the problem but like I said before, It is hard for me to believe it is the card since all the components were in the computer connected and the shock (I assume) was to the case not specific component. When you say Vram it's specific and even more unbelievable but I won't judge. Maybe you are right but it would have been weird if that turned out to be true.


It could also be some power-supply component feeding the VRAM. 
Let's see. I will do further analysis on the weekend, and if I find something new I will let you know.



basco said:


> You can get the stickers off with a heat gun on low and softly push from beneath; try not to pull.
> i use something like this:
> View attachment 195910


Thanks for the hint. 
I did it with a heat gun and a scalpel.
I got one sticker off very well but got a crack in the other.
One sticker says AMD on it and the other says XFX.
The XFX sticker is intact, but the AMD one was really big and cracked during removal.


----------



## lowrider_05 (Apr 9, 2021)

turbogear said:


> It could be some power supply component supplying VRAM as well.
> Let's see. I will do further analysis on weekend and if I find something new I will let you know.


Hello, I would just like to remind you, since you live in Germany, that you should tell the shop where you bought the GPU that you are sending the card back under "Gewährleistung" (statutory warranty) rather than the manufacturer's warranty, because the manufacturers' policies don't apply to this. If the GPU is under 6 months old, just say it does not work anymore and you don't know why; they then have to prove that you killed the card, or else repair/exchange it for you.

Maybe that helps.


----------



## turbogear (Apr 9, 2021)

lowrider_05 said:


> Hello, I would just like to remind you, since you live in Germany, that you should tell the shop where you bought the GPU that you are sending the card back under "Gewährleistung" (statutory warranty) rather than the manufacturer's warranty, because the manufacturers' policies don't apply to this. If the GPU is under 6 months old, just say it does not work anymore and you don't know why; they then have to prove that you killed the card, or else repair/exchange it for you.
> 
> Maybe that helps.


Thanks for the hint. I will try to play ignorant and send the card back as malfunctioning hardware, saying I don't know what happened. 
I will put the warranty stickers back on, and if I am lucky they will not complain about the crack.  

The card is less than 6 months old.


----------



## Felix123BU (Apr 9, 2021)

GamerGuy said:


> @turbogear , Oh man, that's so messed up! Hope you get the replacement card asap. The warranty sticker thing does apply in my neck of the woods: tamper with the stickers and you void the warranty. That's one of the reasons I'd gotten a Nitro+ RX 6800, which was replaced by the Nitro+ RX 6900 XT.
> 
> Would have loved to try an LC'ed RX 6800XT/6900XT, but the warranty issue was the deciding factor... that, and my lack of confidence with regard to applying the thermal pads (not sure of thickness and size) and the overall uncertainty I feel about LC. Guess I'm too old skool; besides, I'm something of an old fart, and I sorta personify the old saying "you can't teach an old dog new tricks".


You could classify me as some sort of old fart too, and reasonably conservative, but after maaany years I went full LC for the first time. Not only was it lots of fun, but the whole thing runs cooler and quieter. In conclusion, you just need to find the right treat for an old dog to make him learn new tricks


----------



## sepheronx (Apr 10, 2021)

lol, I think my 6800xt is being bottlenecked by my 10500ES



OC (Mem: 2040 Core: 2400)


----------



## Felix123BU (Apr 10, 2021)

sepheronx said:


> lol, I think my 6800xt is being bottlenecked by my 10500ES
> View attachment 196094
> OC (Mem: 2040 Core: 2400)
> View attachment 196095


"Bottlenecked by my 10500ES" — not so much that it matters a lot. If you aim to get 100% out of the card, true, but if you are already getting 95% out of it... The possible exceptions are games coded for 8+ cores that run worse on 6, and those are very rare cases.
What is mostly holding your scores back is the 2400MHz core clock and 2040MHz memory clock; the 6800XT generally boosts that high even at stock settings if it stays cool enough. You can try 2600MHz for core and 2090MHz for memory, with a conservative 1090mV for voltage; see how it goes and watch the temps. But to get to 20k graphics scores, you need soft power modding 

And an 18k graphics score at 2400MHz core and 2040MHz memory is not bad at all; it's actually good .


----------



## sepheronx (Apr 10, 2021)

Felix123BU said:


> "Bottlenecked by my 10500ES" — not so much that it matters a lot. If you aim to get 100% out of the card, true, but if you are already getting 95% out of it... The possible exceptions are games coded for 8+ cores that run worse on 6, and those are very rare cases.
> What is mostly holding your scores back is the 2400MHz core clock and 2040MHz memory clock; the 6800XT generally boosts that high even at stock settings if it stays cool enough. You can try 2600MHz for core and 2090MHz for memory, with a conservative 1090mV for voltage; see how it goes and watch the temps. But to get to 20k graphics scores, you need soft power modding
> 
> And an 18k graphics score at 2400MHz core and 2040MHz memory is not bad at all; it's actually good .


I figured that the card is as average as average gets.  At stock settings I get more or less what seems to be the standard score.

Stupid question though, does timespy still run at the 1440p stock even if my monitor is 1080p?


----------



## Felix123BU (Apr 10, 2021)

sepheronx said:


> I figured that the card is as average as average gets.  At stock settings I get more or less what seems to be the standard score.
> 
> Stupid question though, does timespy still run at the 1440p stock even if my monitor is 1080p?


I would not make judgements about how good the card is before trying to push it to the limit  At stock a 6800XT should score around 16500 in Time Spy; you getting 18000 with a medium OC is pretty good.
As far as I remember, yes, Time Spy runs at 1440p regardless of monitor resolution unless you manually select a custom resolution.


----------



## sepheronx (Apr 10, 2021)

Felix123BU said:


> I would not make judgements on how good the card is before I try to push it to the limit  At stock a 6800XT should be around 16500 in Timespy, you getting 18000 with a medium OC is pretty good.
> As far as I remember, yes, Time Spy runs at 1440p if you don't manually select a custom resolution regardless of monitor resolution.


First score is stock.

I'll do further oc later


----------



## lciarlo (Apr 10, 2021)

Just ordered the Gigabyte 6700 XT from the Newegg Shuffle; it should arrive next week.
Planning on pairing it with a 5600X, any thoughts?


----------



## sepheronx (Apr 10, 2021)

lciarlo said:


> Just ordered the Gigabyte 6700 XT from the Newegg Shuffle; it should arrive next week.
> Planning on pairing it with a 5600X, any thoughts?


How much?

My thoughts are, enjoy it.


----------



## Felix123BU (Apr 10, 2021)

lciarlo said:


> Just ordered the Gigabyte 6700 XT from the Newegg Shuffle; it should arrive next week.
> Planning on pairing it with a 5600X, any thoughts?


It's a good pairing, you will most likely enjoy it


----------



## lciarlo (Apr 10, 2021)

sepheronx said:


> How much?
> 
> My thoughts are, enjoy it.


$749. Way too much. But every card on the market is way too much


----------



## Space Lynx (Apr 10, 2021)

lciarlo said:


> $749. Way too much. But every card on the market is way too much



Yep, still can't believe I got my 6800 for $579 MSRP. I probably won't get that lucky again for as long as I live.


----------



## lciarlo (Apr 10, 2021)

lynx29 said:


> yep, still can't believe I got my 6800 for $579 msrp. i probably won't get that lucky again for as long as i live.


The only place you can find MSRP right now is amd.com, but of course it's very hard to find one available; always sold out. I got 2 3070s through the EVGA queue, paid more than the original MSRP but not a whole lot.


----------



## sepheronx (Apr 10, 2021)

lciarlo said:


> the only place you can find msrp right now is on amd.com. but of course very hard to find one available , always sold out. I got 2 3070's on the evga queue, paid more than the original msrp but not a whole lot.


I got a few 3080's on queue at EVGA. I don't expect to get a notification till at least 2026.


----------



## lciarlo (Apr 10, 2021)

sepheronx said:


> I got a few 3080's on queue at EVGA. I don't expect to get a notification till at least 2026.


Believe it or not, in January I got 2 3070s off that queue


----------



## Space Lynx (Apr 10, 2021)

lciarlo said:


> believe it or not, in January i got 2 3070's off that queue



My buddy got a 3080 off the EVGA queue, but he signed up like the day it was announced.


----------



## sepheronx (Apr 11, 2021)

I have an odd issue.

I have a gaming profile and a mining profile in the AMD software.

What happens is: I play a game, and when I'm done, I load the mining profile and run NiceHash. It runs fine for a short while (normally at 61MH/s), then suddenly the fan ramps up, then back down, and from then on my hash rate drops to 30MH/s. I close the miner, load the mining profile again, and away I go. But then it ends up running hot and the fan curve profile refuses to work; I have to force a constant percentage, which doesn't even reflect reality (set to 60% but it's actually running at 80%).

I end up having to reboot the PC, load the mining profile, and run it. Then it works fine.

But if I load the gaming profile, game, then load the mining profile, I'll have problems again.

Is it just a crappy bug in the profile/software?


----------



## Space Lynx (Apr 11, 2021)

sepheronx said:


> I have an odd issue.
> 
> So I have a gaming and mining profile in AMD software.
> 
> ...



Why not just use MSI Afterburner and set a permanent fan speed?


----------



## sepheronx (Apr 11, 2021)

lynx29 said:


> why not just use msi afterburner and set a permanent fan speed?


Same result: 65% makes it run at 80%. Can't seem to figure out why.

After rebooting the PC, though, the fan curve in the AMD software started working.

I'm thinking there may be a conflict between Afterburner and the AMD software.


----------



## Felix123BU (Apr 11, 2021)

sepheronx said:


> Same result. 65% makes it run at 80%.  Can't seem to figure out why.
> 
> After rebooting the PC though, the fan curve in AMD software started to work.
> 
> I'm thinking there may be a conflict going on between Afterburner and AMD software.


Could be a conflict. For the 1.5 months I had the original cooler on mine, fan profiles in Wattman worked flawlessly while switching between mining and gaming profiles. I would get rid of all third-party software if you don't really need it/them.


----------



## sepheronx (Apr 11, 2021)

Felix123BU said:


> Could be a conflict. For the 1.5 months I had the original cooler on mine, fan profiles in Wattman worked flawlessly while switching between mining and gaming profiles. I would get rid of all third-party software if you don't really need it/them.



I removed MSI Afterburner.

How do I add the FPS counter to the metrics overlay? I looked it up, but the instructions seem to be for older software.

NVM, I am a dope. It does it for you automatically.


----------



## turbogear (Apr 12, 2021)

sepheronx said:


> I removed MSI Afterburner.
> 
> How do I add the FPS counter in the metrics overlay?  I look it up but it seems to be on older software.
> 
> NVM, I am a dope.  It auto does it for you.


You can use Rivatuner statistics server as standalone and it can show framerate during gaming.








Guru3D RTSS Rivatuner Statistics Server Download 7.3.3 build 26004
Download RTSS Rivatuner Statistics Server. This is the official homepage for Rivatuner. Initially designed as a small helper application for the RivaTuner graphics card utility, RivaTuner Statistics Server became the de-facto framerate mon...
www.guru3d.com
				




I use that along with HWiNFO. You can log the FPS from RivaTuner through HWiNFO if you want to generate data, for example to calculate averages or make graphs.
With HWiNFO you can also make an overlay that displays the FPS from RivaTuner as well as other sensors like GPU power, voltage, temperatures, etc.

I also don't use Afterburner. 
I do all my OC'ing through the Radeon Software.
The fan curve can be adjusted there, and you can create separate clock and fan-curve profiles depending on what the application needs.
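As a sketch of the averaging step mentioned above: once the RivaTuner framerate has been logged through HWiNFO and exported, the statistics are a few lines of Python. The input here is just a plain list of FPS samples, since the exact export format and column labels vary by HWiNFO version; "1% low" is computed here as the mean of the slowest 1% of samples, which is one common definition:

```python
# Sketch: summarize FPS samples logged via RivaTuner/HWiNFO.
# Input is a plain list of floats (one sample per logging interval).
def fps_stats(samples):
    """Return (average FPS, 1% low FPS) for a list of samples."""
    samples = sorted(samples)
    avg = sum(samples) / len(samples)
    # 1% low: mean of the slowest 1% of samples (at least one sample)
    n_low = max(1, len(samples) // 100)
    one_pct_low = sum(samples[:n_low]) / n_low
    return avg, one_pct_low
```

The same list feeds straight into a plotting library if you want the graphs turbogear mentions.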


----------



## Felix123BU (Apr 12, 2021)

turbogear said:


> You can use Rivatuner statistics server as standalone and it can show framerate during gaming.
> 
> 
> 
> ...


Yeah, I also use RivaTuner Statistics Server together with HWiNFO64; basically any sensor info HWiNFO64 has can be used for the OSD via RivaTuner Statistics Server, but for a simple OSD the built-in Wattman stats are fine. RivaTuner does not interfere with anything; it's just a tool to get the fanciest custom OSD possible.


----------



## sepheronx (Apr 12, 2021)

turbogear said:


> You can use Rivatuner statistics server as standalone and it can show framerate during gaming.
> 
> 
> 
> ...



Yeah, after uninstalling Afterburner, the Radeon Software fan curve profile is now actually working. It won't rev up to the point where my PC takes off (although that would make my flight to India cheaper), and the profile manager is working now too! So I can game and then run the mining profile after I am done. Now I can actually play heavily demanding titles on my RX 6800XT, such as Octopath Traveler, text-based adventures, and more!

I'll install RivaTuner Statistics Server. I have HWiNFO as is. This I'll do on my upcoming time off.


----------



## Felix123BU (Apr 12, 2021)

sepheronx said:


> Yeah, after uninstalling Afterburner, the Radeon Software fan curve profile is now actually working. It won't rev up to the point where my PC takes off (although that would make my flight to India cheaper), and the profile manager is working now too! So I can game and then run the mining profile after I am done. Now I can actually play heavily demanding titles on my RX 6800XT, such as Octopath Traveler, text-based adventures, and more!
> 
> I'll install RivaTuner Statistics Server. I have HWiNFO as is. This I'll do on my upcoming time off.


You're also one of those who buys the latest and greatest GPU then proceeds to mostly play old games that don't utilize 10% of the new hardware, like me?


----------



## sepheronx (Apr 12, 2021)

Felix123BU said:


> You're also one of those who buys the latest and greatest GPU then proceeds to mostly play old games that don't utilize 10% of the new hardware, like me?



no.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................maybe


----------



## ChristTheGreat (Apr 12, 2021)

Question for 6800XT owners:

I am trying to tune my GPU to always run at a minimum of 1800MHz for gaming (I don't want to see the GPU at 1300MHz, because I cap the frame rate). I play at 1080p @ 125fps to keep headroom for recording, which doesn't seem to work because that damn Warzone still uses a lot of GPU (even on low details xD).

Even at 80-90% usage, if I set my minimum clock to 1800MHz, the voltage is 0.891V, and I think that sometimes causes a freeze or reboot. If I leave the GPU at defaults, it runs fine. The voltage is set to 1150mV in the software.

Has anyone seen this?


----------



## Felix123BU (Apr 12, 2021)

ChristTheGreat said:


> Question for 6800XT owners:
> 
> I am trying to tune my GPU to always run at a minimum of 1800MHz for gaming (I don't want to see the GPU at 1300MHz, because I cap the frame rate). I play at 1080p @ 125fps to keep headroom for recording, which doesn't seem to work because that damn Warzone still uses a lot of GPU (even on low details xD).
> 
> ...


At first glance, the voltage you're getting at 1800MHz is way too low (0.891V). Try something like 1030mV and see how it goes; if it's not stable, raise it a bit, and if it's stable, lower it a bit until you find the lowest stable setting.
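That raise/lower loop is effectively a bisection. Here is a Python sketch of the idea, assuming stability is monotonic in voltage; `is_stable` is a hypothetical stand-in for whatever manual stability test you run at each voltage (e.g. "did a Time Spy loop finish without artifacts?"):

```python
# Sketch: Felix123BU's raise/lower procedure expressed as a bisection.
# `is_stable(mv)` is a hypothetical callback: True if the card passed
# your stability test at that voltage. Assumes monotonic stability.
def find_min_voltage(is_stable, lo_mv=850, hi_mv=1150, step_mv=10):
    """Return the lowest voltage (in mV, rounded to step_mv)
    that still passes is_stable."""
    while hi_mv - lo_mv > step_mv:
        mid = (lo_mv + hi_mv) // 2 // step_mv * step_mv
        if is_stable(mid):
            hi_mv = mid   # stable: try lower
        else:
            lo_mv = mid   # unstable: need more voltage
    return hi_mv
```

With a true stability threshold of, say, 1030mV, this converges in about five test runs instead of stepping 10mV at a time.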


----------



## ChristTheGreat (Apr 12, 2021)

Felix123BU said:


> At first glance, the voltage you're getting at 1800MHz is way too low (0.891V). Try something like 1030mV and see how it goes; if it's not stable, raise it a bit, and if it's stable, lower it a bit until you find the lowest stable setting.


I use radeon tuner, set at 1150, but it wont change :/


----------



## Felix123BU (Apr 12, 2021)

ChristTheGreat said:


> I use radeon tuner, set at 1150, but it wont change :/


Not sure what Radeon tuner is; I would just use Wattman, something like the below. Also, something came to mind: do not set the minimum and maximum frequency to the same value, that will misfire. Use a 100MHz difference between the two, with the desired frequency in between; see below:


----------



## ChristTheGreat (Apr 13, 2021)

Felix123BU said:


> Not sure what Radeon tuner is; I would just use Wattman, something like the below. Also, something came to mind: do not set the minimum and maximum frequency to the same value, that will misfire. Use a 100MHz difference between the two, with the desired frequency in between; see below:
> 
> View attachment 196456


Looks like if I play with the FPS locked, the voltage stays low. When I run 3DMark Time Spy, it goes to 1100mV no problem. It seems I can't force it to keep a high voltage and high clock when the card is not pushed to the max.


----------



## Space Lynx (Apr 13, 2021)

I really have no idea how so many of you are having problems. These have been my settings for two months now with 0 issues, so I don't know


----------



## sepheronx (Apr 13, 2021)

Had issues again with mining on this GPU. It will ramp up the fan speed for a few seconds, then go back to normal, but then start mining at 20-30MH/s instead of its initial 62MH/s. This card is f'd.

I can't figure it out. So I am just going to let it continue and monitor it. I may have to send it in for RMA. I'll put it through rigorous gaming tests to see if there are any artifacts or anything else that would make the GPU act so weird. It could also be the drivers, and a format of the system and trying again may fix it.

Edit: Tested Witcher 3. No issues there.

I am wondering if my memory OC was too high and was causing it to fail at mining? But usually when that happens, the miner shows red lettering over the GPU, meaning there is an issue. Mine never showed that. Actually, I would rarely have failed hashes either.


----------



## Felix123BU (Apr 13, 2021)

sepheronx said:


> Had issues again with mining on this GPU. It will ramp up the fan speed for a few seconds, then go back to normal, but then start mining at 20-30MH/s instead of its initial 62MH/s. This card is f'd.
> 
> I can't figure it out. So I am just going to let it continue and monitor it. I may have to send it in for RMA. I'll put it through rigorous gaming tests to see if there are any artifacts or anything else that would make the GPU act so weird. It could also be the drivers, and a format of the system and trying again may fix it.
> 
> ...


Wait, you are using those clocks for mining? No, that's wrong if that's the case. If you mine ETH, set the core clock to 1200MHz min, 1300MHz max, the voltage to 881mV or the lowest you can set, and the memory to a max of 2100MHz, or 10-20MHz lower if it's unstable. That gives you around 61-62MH/s at 150W. I let my 6800XT mine ETH whenever I can; it stays at 61.5MH/s all the time without a hitch. 
The fan auto-ramp could be because the card gets quite hot; the settings above would help a lot with temps while mining.
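For a sanity check on settings like these, the efficiency math is simple enough to script. A small Python sketch (the electricity price is a placeholder assumption, not a quoted rate):

```python
# Sketch: back-of-envelope mining efficiency from hash rate and power.
# price_per_kwh is a placeholder assumption; substitute your own rate.
def mining_efficiency(hashrate_mhs, power_w, price_per_kwh=0.30):
    """Return (MH/s per watt, kWh consumed per day, energy cost per day)."""
    mh_per_watt = hashrate_mhs / power_w
    kwh_per_day = power_w * 24 / 1000   # watts -> kWh over 24 hours
    cost_per_day = kwh_per_day * price_per_kwh
    return mh_per_watt, kwh_per_day, cost_per_day
```

For the numbers above, 61.5MH/s at 150W works out to 0.41MH/s per watt and 3.6kWh per day, which is why the low-core-clock profile is worth the trouble.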


----------



## sepheronx (Apr 13, 2021)

Felix123BU said:


> Wait, you are using those clocks for mining? No, that's wrong if that's the case. If you mine ETH, set the core clock to 1200MHz min, 1300MHz max, the voltage to 881mV or the lowest you can set, and the memory to a max of 2100MHz, or 10-20MHz lower if it's unstable. That gives you around 61-62MH/s at 150W. I let my 6800XT mine ETH whenever I can; it stays at 61.5MH/s all the time without a hitch.
> The fan auto-ramp could be because the card gets quite hot; the settings above would help a lot with temps while mining.



what?

No.

I had it at 500-1500 core, the lowest voltage, and memory at 2150MHz, also set to fast timings on the VRAM.

I have now switched the memory to 2100MHz. That may have been the issue.


----------



## Felix123BU (Apr 13, 2021)

sepheronx said:


> what?
> 
> No.
> 
> ...


Most likely. If the memory is not stable it will drop the hash rate a lot over time; try to find a memory speed that remains stable. If it still drops over time, the memory is most likely not stable at the set speed. At 2150 with fast timings, memory operations will crap out, giving a low hash rate. From the info I gathered on some forums, 2100 is about the max with fast timings (depending on how good the memory chips are), and the difference between fast and normal timings is negligible. With normal timings you might get a few extra MHz stable over 2100, but it's not really worth it.
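The "back off in small steps until it holds" approach described here can be sketched as a simple search. This is illustrative only; `stable_at` stands in for watching the miner's reported hash rate at a given memory clock, it is not a real API:

```python
# Sketch of stepping the memory clock down until it tests stable.
# stable_at(clk) is a stand-in for "run the miner at clk MHz and
# check the hash rate holds over time" -- NOT a real API.
def find_stable_mem_clock(start_mhz, floor_mhz, step, stable_at):
    """Walk the memory clock down in small steps until it tests stable."""
    clk = start_mhz
    while clk >= floor_mhz:
        if stable_at(clk):
            return clk
        clk -= step
    return floor_mhz

# Example: pretend anything above 2100 MHz silently drops the hash rate
print(find_stable_mem_clock(2150, 2000, 10, lambda mhz: mhz <= 2100))  # 2100
```

The same loop works for the core clock; the important part is testing each step long enough, since unstable memory often degrades the hash rate gradually instead of crashing outright.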


----------



## sepheronx (Apr 13, 2021)

I am getting around the same amount that I got before anyway at the current 2100.  I may lower it to default timings for optimal stability then.


----------



## turbogear (Apr 13, 2021)

sepheronx said:


> I am getting to around the same amount that I got before anyway at current 2100.  I may lower it to just default timings for optimal stability then.


It could very well be that your memory is unstable. You should step up in smaller increments toward 2150MHz to see what the highest clock without a performance drop is.
My experience with multiple RDNA2 cards is that very few can run much higher than 2100MHz on memory.
Increasing it past the stable point drops the performance.


----------



## Space Lynx (Apr 13, 2021)

turbogear said:


> It could very well be your memory is unstable. You should go in smaller steps towards 2150MHz to see what is highest without performance drop.
> My experience with multiple RDNA2  cards shows that there are very few cards that can operate much higher than 2100MHz on memory.
> Increasing it further than stable point drops the performance.



I get errors in game if I OC my VRAM at all. I can OC the GPU core hard with zero issues though, and that's all that really matters for FPS increases in gaming.


----------



## sepheronx (Apr 13, 2021)

turbogear said:


> It could very well be your memory is unstable. You should go in smaller steps towards 2150MHz to see what is highest without performance drop.
> My experience with multiple RDNA2  cards shows that there are very few cards that can operate much higher than 2100MHz on memory.
> Increasing it further than stable point drops the performance.


Yeah, I figure this is the case. It's at 2100 and so far so good.

It runs fine in games now as well.  I really need to upgrade the cooling on this card. I've refrained from water cooling as it's a messy expense. But I wonder, are there any AIOs planned for these GPUs besides the specialized GPU models?


----------



## Garlic (Apr 14, 2021)

Upgraded from a 5700 XT to this; so far it performs great and the junction temp never goes above 65 when gaming.  Also surprised how far this can OC: 2920-2930 is when it starts to crash, but 2910 is very stable.



Garlic said:


> Upgraded from a 5700xt to this, so far performs great and junction temp never go above 65 when gaming.  also surprised how far this can OC, 2920-2930 is when it starts to crash but 2910 is very stable.


Meant to say junction temp never goes higher than 75 when gaming


----------



## turbogear (Apr 14, 2021)

Garlic said:


> Upgraded from a 5700xt to this, so far performs great and junction temp never go above 65 when gaming.  also surprised how far this can OC, 2920-2930 is when it starts to crash but 2910 is very stable.
> 
> 
> Meant to say junction temp never goes higher than 75 when gaming


Welcome to the club.


----------



## GamerGuy (Apr 14, 2021)

Garlic said:


> Upgraded from a 5700xt to this, so far performs great and junction temp never go above 65 when gaming.  also surprised how far this can OC, 2920-2930 is when it starts to crash but 2910 is very stable.
> 
> 
> Meant to say junction temp never goes higher than 75 when gaming


Awesome, welcome to the club!!!


----------



## basco (Apr 14, 2021)

Mr. Garlic, welcome! Would ya plz upload a GPU-Z screenshot, and is this stable in benchmarks?
What's the device ID?
Thanks, I'm just interested because that would be an enormous overclock.


----------



## Garlic (Apr 14, 2021)

basco said:


> Mr. Garlic welcome and would ya plz upload a screenshot of gpu-z and is this stable in benchmarks?
> the device id ?
> thanks i am just interested because that would be an enormous overclock.


Here. In games it sits around 2800-2830 and is stable, but as soon as I go over 2915 it crashes. In Furmark it's quite a bit lower for some reason, probably because it draws a lot more current than most things.


----------



## Raendor (Apr 14, 2021)

May I now enter the club too? 




Received my upgrade yesterday and spent some time in the evening re-assembling the rig, since I put in a new mobo and CPU too.

Now to find out what makes sense in terms of clocks and voltage. Much more heat coming out of it compared to the old 1080.


----------



## turbogear (Apr 14, 2021)

Raendor said:


> May I now enter the club too? View attachment 196703View attachment 196704
> received my upgrade yesterday and spent some time in the evening to re-assemble the rig, since I put a new mobo and cpu too.
> 
> now to find out what makes sense to do in terms of clocks and voltage. Much more heat coming out of it compared to old 1080.


Wow, that's really compact.
I hope you will be able to cool it well in that small case.


----------



## Raendor (Apr 14, 2021)

turbogear said:


> Wow that's really compact.
> I hope you would be able to cool it well in this small case.


Ran RDR2 a bit, and at the 254W mark (I guess that's the TDP limit, as I haven't seen it higher) it was no higher than 78C on the default fan curve. Didn't tweak anything yet as I've got no time, so it'll be a slow and gradual process. The case is fine: mesh everywhere, with two 120mm intake fans under the GPU and a 120mm exhaust on the side panel doing well. But I'd definitely undervolt before summertime.


----------



## Felix123BU (Apr 14, 2021)

Raendor said:


> May I now enter the club too? View attachment 196703View attachment 196704
> received my upgrade yesterday and spent some time in the evening to re-assemble the rig, since I put a new mobo and cpu too.
> 
> now to find out what makes sense to do in terms of clocks and voltage. Much more heat coming out of it compared to old 1080.


"May I now enter the club too?" - Absolutely, 1000%, welcome mate


----------



## Garlic (Apr 14, 2021)

Raendor said:


> Ran RDR2 a bit and when it’s at 254w mark (guess it’s tdp limit as I haven’t seen it higher) it was no higher than 78 C at default fan curve. Didn’t tweak anything yet as I’ve got no time. So it’ll be a slow and gradual process. Case is fine. Mesh everywhere and 2 120 intake fans under gpu and 120 exhaust on side panel doing well. But I’d definitely undervolt before summertime.


78C junction temp? If so, that's really good; if not, you might want to look at the junction temp, since it's almost always 10-20C hotter.


----------



## Raendor (Apr 15, 2021)

Garlic said:


> 78 Junction temp? if so thats really good, if not you might want to look at the junction temp since its almost always 10-20C hotter


Not sure now which temp I was looking at, but I guess general one rather than junction. Will take a closer look, but some undervolt won’t hurt for sure.




Felix123BU said:


> "May I now enter the club too?" - Absolutely, 1000%, welcome mate



Honestly I'm wondering if I should keep the 6800 XT or just put the 6800 in my system instead (which I ordered earlier, but is coming later lol). Of course I like the extra power and the looks of the Midnight Black, but the 6800 is a bit smaller and less power-hungry. On the other hand, having 12 more CUs is great, and nothing stops me from tweaking the card closer to 6800 consumption levels; I just need to study a bit, as I haven't used a Radeon since 2004. 
My only real complaint is the coil whine, which apparently is quite common on these reference models.

How's it for you guys? My PSU is an SF750 less than a year old, so I don't think it has anything to do with it.


----------



## GamerGuy (Apr 15, 2021)

Garlic said:


> 78 Junction temp? if so thats really good, if not you might want to look at the junction temp since its almost always 10-20C hotter


Wow, 78C is pretty awesome. My Nitro+ RX 6900 XT junction temp hangs around 83-85C, and the GPU temp itself is around the mid-70s, so the delta between them is about 10C.
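That edge-to-junction gap is easy to keep an eye on; here's a tiny sketch (the helper names and the ~20C threshold are illustrative — a delta creeping well past that is often taken as a hint of poor die-to-cooler contact):

```python
# Illustrative helper for watching the edge-vs-junction temperature gap.
# The 20 C threshold is a rule of thumb from forum lore, not a spec.
def junction_delta(edge_c: float, junction_c: float) -> float:
    """Gap between the hotspot (junction) sensor and the edge sensor."""
    return junction_c - edge_c

def delta_looks_high(edge_c: float, junction_c: float, limit_c: float = 20.0) -> bool:
    """Flag a delta well past the usual ~10-20 C range."""
    return junction_delta(edge_c, junction_c) > limit_c

print(junction_delta(75, 85))    # 10 -- the kind of gap described above
print(delta_looks_high(75, 85))  # False
```

A reading like mid-70s edge with mid-80s junction is well within the normal band; the check only trips when the gap grows unusually large.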

To all those guys who've recently gotten their RX 6000 cards....WELCOME!!!


----------



## Garlic (Apr 15, 2021)

After playing around with Wattman and games for a few hours I was able to push the GPU a bit more while still keeping the junction temp under 80C; this actually looks like pretty good silicon. BUT I noticed that despite taking the voltage down to 1125mV it still goes to 1200mV for some reason. If anyone knows why this is the case, please let me know.


----------



## Felix123BU (Apr 15, 2021)

Garlic said:


> After playing around with wattman and games for a few hours I was able to push the gpu a bit more while still having junction temp under 80C, this actually looks like a pretty good silicon. BUT I noticed that despite me taking the voltage down to 1125 it still goes to 1200 for some reason, if anyone know why this is the case please let me know


From what I've noticed on mine, and on some other 6000 series cards, the voltage you put in is more of a guideline: the GPU seems to use whatever voltage it deems necessary to keep running stable, while at least trying to stay close to what you set. On mine, the higher the clocks, the more it goes over the voltage I set.


----------



## BarbaricSoul (Apr 15, 2021)

I currently hold the highest Time Spy score for an i9 10850K/RX 6700 XT system-
https://www.3dmark.com/spy/19710279 










----------



## Garlic (Apr 15, 2021)

BarbaricSoul said:


> I currently hold the highest Time Spy score for a i9 10850k/RX6700XT system-
> https://www.3dmark.com/spy/19710279
> 
> 
> ...


Nice, I have the same setup with a 10850K and a 6700 XT; might try to break that record lol. I have my i9 clocked at 4.9 and GPU at 2910. Is your setup overclocked? If so, by how much? Thanks.


----------



## BarbaricSoul (Apr 16, 2021)

CPU at 5GHz all cores, and GPU OC'd in Afterburner to 2.9GHz, but it only goes to about 2.7.


----------



## Garlic (Apr 16, 2021)

My gpu goes to about 2.83 in sustained loads


----------



## BarbaricSoul (Apr 16, 2021)

just did a little work with Afterburner, mine is doing 2.8 also.



Garlic said:


> might try to break that record



Just to make it interesting CPU at 5.2 GHz and GPU at 2.8 GHz, new high score, 12729









I scored 12 729 in Time Spy
Intel Core i9-10850K Processor, AMD Radeon RX 6700 XT x 1, 16384 MB, 64-bit Windows 10
www.3dmark.com


----------



## Space Lynx (Apr 16, 2021)

BarbaricSoul said:


> just did a little work with Afterburner, mine is doing 2.8 also.
> 
> 
> 
> ...



Stable in benchmarks is not the same thing as stable in games. Run WoW, Shadow of the Tomb Raider, Witcher 3, or any demanding AAA title on max settings for around 2 hours with no crashes; if you can do that, then you are stable. I can run benchmarks like Time Spy all day stable with extreme OCs, but not in-game.


----------



## BarbaricSoul (Apr 16, 2021)

I played about 4 hours of COD last night with no crashes. Granted, that was with the GPU at 2.7 GHz, not 2.8, but I'd bet I'd get the same results at 2.8.



lynx29 said:


> stable in benchmarks is not same thing as stable in games. run WoW, Shadow of the Tomb Raider, Witcher 3, or any demanding game AAA title on max settings for around 2 hours with no crashes. if you can do that then you are stable. I can run benchmarks like time spy all day stable with extreme oc's. but not in-game


----------



## Space Lynx (Apr 16, 2021)

BarbaricSoul said:


> I played about 4 hours of COD last night, no crashes, granted, that was with the GPU at 2.7 GHz, not 2.8, but I'd bet I'd get the same results at 2.8.



60-70% chance you're OK then, but you still need to play some more AAA titles; one isn't enough to determine it. Just my two cents anyway. You're probably OK though, yeah.


----------



## kapone32 (Apr 16, 2021)

lynx29 said:


> 60-70% chance your ok then, you still need to play some more AAA titles, one isn't enough for determining it. just my two cents anyway. you probably are ok though yeah.


Did they release something in the last driver? My card used to sit nice and cool at 65C on the junction at 2.4GHz, but it now seems to want to boost as high as it can (I saw 2.8 in HWiNFO64). I was so pissed I took the card apart to see if there were any issues. That resulted in a 3-degree drop in idle temps, but it has me so upset that I actually ordered a water block for the card yesterday.


----------



## Space Lynx (Apr 16, 2021)

kapone32 said:


> Did they release something in the last driver? My card used to be nice and cool at 65 C on the junction at 2.4 GHZ but it now seems to want to boost as high as it can (I saw 2.8 in HWinfo64). I was so pissed I took the card apart to see if there were any issues. That resulted in a 3 degree drop in idle temps but it has me so upset that I actually ordered a water block yesterday for the card.



I have always used a custom aggressive fan curve, so my temps are always low, so I can't answer that question. I'm honestly jelly of both of you; the 6800 non-XT maxes out at 2600, and mine only does 2550 stable. 99% of the time, regardless of temp, it downclocks to 2.4 though. I don't mess with Wattman though... so maybe I could get more out of it? Do you use Wattman?


----------



## Felix123BU (Apr 16, 2021)

kapone32 said:


> Did they release something in the last driver? My card used to be nice and cool at 65 C on the junction at 2.4 GHZ but it now seems to want to boost as high as it can (I saw 2.8 in HWinfo64). I was so pissed I took the card apart to see if there were any issues. That resulted in a 3 degree drop in idle temps but it has me so upset that I actually ordered a water block yesterday for the card.


What driver is that that pushes it to its max? I waaaant it!


----------



## kapone32 (Apr 16, 2021)

lynx29 said:


> I have always used a custom aggressive fan curve, so my temps are always low, so I can't answer that question. I'm honestly jelly of both of you, the 6800 non-xt maxes out at 2600 - and mine only does 2550 stable - 99% of time regardless of temp it downclocks to 2.4 though. i don't mess with wattman though... so maybe I could get more out of it? do you use wattman?


Yes I do use Wattman and I have to reset my profile regularly. There was no issue until the latest driver was installed. I am really interested to see what happens when I install the water block.


----------



## BarbaricSoul (Apr 16, 2021)

I'm running the newest driver with my Sapphire Nitro. I'm also running a fan curve of 1% per every 1 degree of temperature. Running Time Spy, clocking 2.78-2.83, my fans run at 51% maintaining a temp of 51C.
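The "1% per degree" curve is simple enough to sketch. This is purely illustrative (not a Wattman export), with an assumed floor and ceiling clamped to a plausible driver range:

```python
# Sketch of the "1% fan duty per 1 degree C" curve mentioned above,
# clamped to an assumed driver range. Illustrative only.
def fan_percent(temp_c: float, min_pct: float = 20.0, max_pct: float = 100.0) -> float:
    """Map temperature directly to fan duty: 1% per degree C, clamped."""
    return max(min_pct, min(max_pct, float(temp_c)))

print(fan_percent(51))   # 51.0 -- matches the 51% at 51 C reported above
print(fan_percent(10))   # 20.0 (floor)
print(fan_percent(120))  # 100.0 (ceiling)
```

The appeal of this curve is that the fan speed and temperature converge on the same number, which makes eyeballing the equilibrium point trivial.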


----------



## turbogear (Apr 16, 2021)

BarbaricSoul said:


> I’m running the newest driver with my Sapphire Nitro. I’m also running a fan curve of 1% per every 1 degree of temperature. Running Time Spy, clocking 2.78-2.83, my fans run at 51% maintaining a temp of 51’


Damn these 6700XT cards can really clock high.


----------



## Felix123BU (Apr 16, 2021)

turbogear said:


> Damn these 6700XT cards can really clock high.


Yeah, imagine the 6800XT and 6900XT clocking that high


----------



## kapone32 (Apr 16, 2021)

Felix123BU said:


> What driver is that that pushes it to its max? I waaaant it!


It has to be the latest Adrenalin update.


----------



## Felix123BU (Apr 16, 2021)

kapone32 said:


> It has to be the latest Adrenlin update.


Might try it, though I run the latest WHQL which brought nice performance improvements across the board, so not sure if I want to jinx it, but curiosity always kills the cat


----------



## turbogear (Apr 16, 2021)

Felix123BU said:


> Yeah, imagine the 6800XT and 6900XT clocking that high


Maybe one of these 6900XT Liquid Devil Ultimates could fulfil your wish:


----------



## Felix123BU (Apr 16, 2021)

turbogear said:


> Maybe one of these 6900XT Liquid Devil Ultimate could fulfil your wish:


Could be, if AMD in their great wisdom do not decide to cock-block them, pardon me, clock-block them at 2.8


----------



## turbogear (Apr 16, 2021)

Felix123BU said:


> Could be, if AMD in their great wisdom do not decide to cock-block them, pardon me, clock-block them at 2.8


This one has its limit set at 4000MHz, and the voltage can go to 1.2V. 
It is one of the new binned chips where AMD allowed an increased GPU clock limit and memory clock limit.
The BIOS power limit is 332W by default. 
So I think with the additional 15% from Radeon Software it has a power limit of 382W.
The price at Mindfactory is scary at 2349€.
I assume one pays a premium for the unlocked card.  
But there is no promise that you can reach 3GHz as the above video shows.
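The power-limit arithmetic quoted above (332W BIOS limit plus the 15% slider) is simple enough to check with a trivial sketch (helper name is illustrative):

```python
# A trivial check of the power-limit arithmetic above: the effective
# limit is the BIOS limit scaled by the Radeon Software power slider.
def effective_power_limit(bios_limit_w: float, slider_pct: float) -> float:
    """Board power limit after applying the +% power slider."""
    return bios_limit_w * (1 + slider_pct / 100)

print(round(effective_power_limit(332, 15)))  # 382
```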

JayzTwoCents does not look like a pro overclocker like us, though.
He is a funny guy. 

More info to these monsters:





POWERCOLOR REVEALS THE RED DEVIL AND LIQUID DEVIL AMD RADEON™ RX 6900 XT ULTIMATE - PowerColor
www.powercolor.com


----------



## Garlic (Apr 16, 2021)

turbogear said:


> This one has limit set at 4000MHz and voltage can go to 1.2V.
> It is one of these new binned chips were AMD allowed to increased GPU clock limit and also memory clock limit.
> Bios power limit by default at 332W.
> So I think with 15% additional from Radeon Software it has power limit of 382W.
> ...


I mean, it all depends on luck; if you're really unlucky you might get a chip that can't OC at all (which is very unlikely to happen).

While playing games last night, I noticed that my PC can play anything stable as long as it's not made by EA; if it is, the PC will crash. I've tried other AAA titles and none of them crash lmao.


----------



## Felix123BU (Apr 16, 2021)

turbogear said:


> This one has limit set at 4000MHz and voltage can go to 1.2V.
> It is one of these new binned chips were AMD allowed to increased GPU clock limit and also memory clock limit.
> Bios power limit by default at 332W.
> So I think with 15% additional from Radeon Software it has power limit of 382W.
> ...


What is funny is that my puny MSRP 6800XT, OC'd, beats the Time Spy scores of that 2300-euro mega hyper liquid shmiquid Devil at stock with a bucket of ice cooling it


----------



## Raendor (Apr 17, 2021)

Do any of you guys play F4 by any chance? My FPS there is locked to either 45 or 55. I guess it has something to do with the engine's strange internal vsync, but on my GTX 1080, when dictated by the application, it was always at 60 as intended. Maybe it needs some specific different settings?


----------



## xenosys (Apr 17, 2021)

Felix123BU said:


> What is funny that my puny MSRP 6800XT OC-ed beats the time spy scores of that 2300 Euro mega hyper liquid shmiquid Devil at stock with a bucket of ice cooling it



You're correct, although JayzTwoCents isn't very proficient with overclocking AMD cards at all.  I'm sure the equivalent of a 6900XT golden sample under ice water with a higher voltage and power limit would score better than 20900 in Time Spy.  With MPT to up the wattage by 60-70 watts, and some tweaking in Wattman, you can probably score upwards of 22-23k on that card.


----------



## Felix123BU (Apr 17, 2021)

xenosys said:


> You're correct, although JayZ2Cents isn't very proficient with overclocking AMD cards at all.  I'm sure the equivalent of a 6900XT Golden Sample under ice water with a higher voltage and power limit would score better than 20900 in Timespy.  With MPT to up the wattage by 60-70 watts, and some tweaking in Wattman, you can probably score upwards of 22-23k on that card.


Indeed, I was basically screaming "MPT, MPT, MPT" the whole video. There is one universal thing that makes the 6000 series quite a lot faster: MPT and moar watts  


Core voltage control would be brilliant too, assuming it ever happens.


----------



## Space Lynx (Apr 17, 2021)

Felix123BU said:


> Indeed, I was basically screaming the whole video "MPT, MPT, MPT", there is one universal thing that makes the 6000 series quite a lot faster, MPT and moar watts
> 
> 
> Core voltage control would be brilliant too, assuming it will ever happen.



wth is MPT? I have had a 6800 since launch and no idea what y'all are on about lol


----------



## Felix123BU (Apr 17, 2021)

lynx29 said:


> wth is mpt, i have had a 6800 since launch and no idea what y'all are on about lol


More Power Tool, from Igor's Lab. It allows you to raise the power limits on these cards beyond their maximum, giving a nice performance boost at the cost of more heat and of course more power


----------



## Space Lynx (Apr 17, 2021)

Felix123BU said:


> More Power Tool, igors lab, allows you to up the power limits on these cards over their maximum, giving a nice performance boost, at the cost of more heat and ofc more power



I just looked into it; yeah, I'm not willing to do a BIOS flash to use it. If I were under custom water I'd give it a go, but on the stock cooler, even with a strong fan curve, I don't want to push the junction temp more than it already goes


----------



## Felix123BU (Apr 17, 2021)

lynx29 said:


> i just looked into it, yeah im not willing to do a biosflash to use it. if i was under custom water i'd give it a go, but stock cooler even with strong fan curve, i don't want to push it more than it already goes on the junction temp


You do not need to do a BIOS flash to use it: just extract the BIOS, load it into MPT, modify the power settings, write them to the Windows soft PowerPlay table, and you are good to go for increased power (basically some registry tweaks). You do not need to flash a modified BIOS to the card 

And yeah, it depends on how good your cooling is. If your junction temp is already very high, then it's not an option; you need very good cooling to raise the power "safely"


----------



## Raendor (Apr 18, 2021)

Finally, after hours of different issues, I'm starting to find the clock and voltage settings that let me run RDR2 (my test game) below the 220W mark (more often around 200W). Had to set a 2250-2350MHz range and 1075mV. It's pretty odd that these settings don't map directly onto the in-game numbers, as my clock hovers around the 2300 mark and the voltage at 1000mV. 1050 seems unstable and 1100 is already too much wattage for my liking, so I might test in between, or just settle and not bother. At the default fan curve I had the junction in the mid-to-high 80s, but with a custom fan curve up to 60% the junction sits in the mid 70s, and overall temps in the 60s at about 50% rpm. Might lower the curve too for some middle ground. Overall it's OK, but very time-consuming.


----------



## turbogear (Apr 19, 2021)

I have owned a 6900XT Liquid Devil Ultimate since Saturday.  
I bought it as a birthday gift for myself, as the replacement for my damaged 6800XT. 




The 6800XT that was damaged by ESD was sent to the shop a few days ago for repair or replacement.
Let's hope for the best. 
I am not sure it will be repaired at all, as I had removed the original cooler and replaced it with a water block.
I put the card back in its original form, but I am sure they will notice that the warranty stickers were removed and re-applied.
So I don't know whether I will get a replacement or not. If I do, I will sell it on eBay. 


The 6900XT Liquid Devil Ultimate is a completely different beast. It is a specially binned 6900XT meant to achieve higher performance.
AXRX 6900XTU 16GBD6-W2DHC/OC - PowerColor

It has two BIOS files: OC and Unleash.
The Unleash BIOS has a default power limit of 332W. If you add 15% from the Radeon Software slider, you then have 382W of headroom. 

This card is a monster of performance, I can already say after the first short test session.
Just applying the Auto Tuning GPU overclock gave me a good feeling that I have a very capable card on my hands.
I have had multiple 6800XTs and 6900XTs in my hands, but this is the highest auto-OC frequency I have ever seen. 






I had very little time over the weekend to play around with it due to my birthday, and with just a little tuning of only the core frequency, I was able to get it running stable at 2750MHz@1175mV.
The Fire Strike stress test passed with a score of 98.8% at these settings. I am sure a little more voltage tuning will bring the stability score into the 99.5% range.

This is the Time Spy score.
I have not yet tuned the VRAM at all; it is at default.
I believe the score will go toward 22K or higher once I tune the VRAM frequency. 
I also did not try a higher GPU clock. Maybe something like 2800MHz would also be possible.



I ran the Assassin's Creed Valhalla benchmark, and this is already 11% faster than my 6800XT, which was overclocked to 2750MHz@1020mV and averaged 123 FPS; here I get 137 FPS. 



Here are the settings that I used:



For the air-cooled 6900XT Ultimate, one needs to apply some more power with MPT, as the default with the OC BIOS is set by PowerColor to only 303W. 
For the Liquid 6900XT Ultimate it is, as mentioned above, 332W with the Unleash BIOS.
I had a chat with W1zzard on Sunday and he provided me with a new beta GPU-Z which now supports these new cards. Thanks @W1zzard  for the help. You are really great. 


So I was able to save the BIOS and load it into MPT. I also uploaded both BIOSes to the TPU database.
Note: one needs MPT version 1.3.4; the older 1.3.3 does not support these new cards.


I have not changed anything with MPT so far; I have only looked at the default settings for the Unleash BIOS.
So you can see below that 332W is the default, i.e. 382W total with the 15% slider.
There is still performance on the table. Looking at the TDC, it is set to 320A by default.
It can be pushed further, toward 382A, to match the core power. 

The maximum GFX voltage is 1200mV for this card compared to 1175mV for regular 6900XT.
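A rough sanity check of the TDC reasoning above: the current limit multiplied by the GFX voltage bounds the core power that is actually available, so a 320A TDC can cap the core below the 382W power limit. The 1.0V operating point used here is illustrative, not a measured value:

```python
# Rough sanity check of the TDC-vs-power-limit reasoning above:
# core power is bounded by TDC (A) times GFX voltage (V).
# The 1.0 V operating point is an assumption for illustration.
def power_at_tdc(tdc_a: float, vgfx_v: float) -> float:
    """Core power (W) implied by a current limit (A) at a given voltage (V)."""
    return tdc_a * vgfx_v

print(round(power_at_tdc(320, 1.0)))  # 320 -- stock TDC caps below 382 W
print(round(power_at_tdc(382, 1.0)))  # 382 -- TDC raised to match the limit
```

This is why raising only the power limit may not help once the card is current-limited; the TDC has to move up with it.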


The frequency slider here can go up to 4GHz.
I am pretty sure I will never reach that frequency. 



I am hesitant at the moment to push the card to its limits.
I suspect that the ESD shock that killed my 6800XT could also have affected the PSU.

The new card ran perfectly fine with these really high power settings on the same PSU, but I don't have peace of mind and don't want to risk this high-end card.
I am looking for a new 1200W supply. The HX1200i is the one I own and I would love to get the same again, but it is really difficult to find any Platinum-rated 1200W supplies at the moment. 

PowerColor recommends a minimum 900W power supply for this card on their website.


----------



## xenosys (Apr 19, 2021)

turbogear said:


> I own now 6900XT Lquid Devil Ultimate since Saturday.
> I bought it as birthday gift for myself as the replacement for my damaged 6800XT.
> 
> View attachment 197423
> ...



Looks great! 

Nice to see someone post some more information regarding this card.  It's like finding a needle in a haystack at the moment.  Would be interested to see what sort of scores you could post with another 70-80 watts of power given it's a 525w card, with some tweaking in Wattman to Memory, Voltages and Frequencies.  I can easily see this card hitting 22.5-23k with a stable 2850-2900Mhz on the core clock in 3D Mark.


----------



## turbogear (Apr 19, 2021)

xenosys said:


> Looks great!
> 
> Nice to see someone post some more information regarding this card.  It's like finding a needle in a haystack at the moment.  Would be interested to see what sort of scores you could post with another 70-80 watts of power given it's a 525w card, with some tweaking in Wattman to Memory, Voltages and Frequencies.  I can easily see this card hitting 22.5-23k with a stable 2900Mhz on the core clock in 3D Mark.


Yes, there seems to be a lot of potential still, but as I said, I am hesitant at the moment to push it very high before I can get another 1200W Platinum-class PSU.
The reason my 6800XT died was an ESD shock to the card. It could be that the PSU also took some of the hit.
So I am being careful until I get a new supply. I don't want to risk this nice, expensive piece of hardware.
I paid 2349€ for this 6900XT performance monster.  

Once I have a new supply I will RMA my Corsair HX1200i so that it can be checked by them.
It has a 7-year warranty and I have had it for 3 years.

There are not many reviews out there, and in some that I found, the reviewers seem not to know how to handle RDNA2 cards when it comes to OCing, so they were not getting any performance out of them.
Maybe I am super lucky and their samples were not as good, or they just don't know how to properly tune these.  

Some reviewers had the air-cooled Ultimate, which is limited to 303W, and that could be the reason they were not getting good performance.
One reviewer wrote that for his LC testing he got a new 332W BIOS from PowerColor.
So basically, maybe PowerColor sent him the BIOS from the Liquid Devil Ultimate to flash onto his air-cooled Ultimate. 
They sent him a new version of the flashing tool, amdvbflashWin v3.2.
I have amdvbflashWin v3.15, but that does not support the Ultimate cards.

So theoretically it may be possible to flash the 6900XT Liquid Devil BIOS onto an air-cooled 6900XT Ultimate, but one does not need to flash a BIOS to get higher power targets.
That is easily done with MPT.


----------



## Space Lynx (Apr 19, 2021)

Seeing as how the world of silicon is now dead due to mining until at least 2023... I may consider going full custom water; always wanted to. I could get an EK block for my 6800 and try this Wattman stuff out.  Is it safe to OC heavy on water 24/7, as long as temps are OK? I care about the longevity of my GPU, which is the main reason I don't bother.


----------



## turbogear (Apr 19, 2021)

lynx29 said:


> Seeing as how the world of silicon is now dead due to mining until at least 2023... I may consider going full custom water... always wanted to... could get an EK block for my 6800 and try this wattman stuff out.  its safe to OC heavy on water 24/7? as long as temps are ok? i care about longevity of my gpu is main reason i don't bother.


Putting a water block onto these cards is easy.
The temperatures will be much better.
The performance depends on the silicon lottery.
If you are lucky and have a great card, it can be pushed more; if not, water cooling does not help. 

For the 6800 you should not push the power with MorePowerTool as far as you would on, for example, a 6800XT or 6900XT.
If you have the reference card, the VRM is not as powerful as on the 6800XT reference; your card has, I think, fewer phases.
Among non-reference cards you will probably find some with more powerful VRMs.


----------



## GamerGuy (Apr 20, 2021)

@turbogear - I have the excellent Nitro+ RX 6900 XT and I find myself actually jealous of you with that awesome RX 6900 XT LDU!

Regardless, 'grats on scoring that monstrous card!


----------



## turbogear (Apr 20, 2021)

GamerGuy said:


> @turbogear - I have the excellent Nitro+ RX 6900 XT and I find myself actually jealous of you with that awesome RX 6900 XT LDU!
> 
> Regardless, 'grats on scoring that monstrous card!


Thanks a lot.  
If the 6800XT hadn't failed, I would not have given another 6900XT a chance.
I had tested two 6900XTs before, the Red Devil and the Sapphire Toxic, and neither was worth the money, as my 6800XT OC'd to within 300-400 points of both of them; but these new XTXH-silicon cards seem to be in a league of their own.


----------



## ratirt (Apr 20, 2021)

turbogear said:


> I own now 6900XT Lquid Devil Ultimate since Saturday.
> I bought it as birthday gift for myself as the replacement for my damaged 6800XT.
> 
> View attachment 197423
> ...


I wonder if I could put the BIOS from your card on my Red Devil. I could get some stuff unlocked that way.
There should not be any difference between the Ultimate and the regular Red Devil.

BTW, what TJmax do you get with the liquid cooler?


----------



## turbogear (Apr 20, 2021)

ratirt said:


> I wonder if I could put the BIOS from your card on my Red Devil. I could get some stuff unlocked that way.
> There should not be any difference between the Ultimate and the regular Red Devil.
> 
> BTW, what TJmax do you get with the liquid cooler?


I don't think you can flash the BIOS just like that without some modification. The new cards have a different device ID, so I don't know whether flashing the BIOS onto a regular 6900XT could brick it.
These new cards are based on high-binned XTXH silicon. In theory, if you were lucky with your regular 6900XT and have one of the low-leakage chips, it could also hit higher frequencies. 

I suppose that in the future, regular high-binned 6900XT Red Devils will become rare, as the best chips will most probably be sorted out by AMD for the XTXH variants that are going to be launched soon by multiple partners. 

I don't think you need a new BIOS unless you want to push the voltage towards 1200mV.
Did you try to OC your Red Devil?
If you are able to push it towards 2.8GHz with more power through MPT, then yes, it could make sense to unlock the higher frequencies; otherwise it would not bring you any benefit. 

At my 2750MHz setting the card stays far from 1200mV. The voltage goes into the range of 1115mV when the frequency peaks at 2700MHz, so mine does not need that high a voltage. 
That is why my first setting was 2750MHz@1175mV, and it works stably without issues.
I have not tried, for example, 2800MHz or higher; in that case the voltage might head in the direction of 1175mV. 
As I mentioned above, I am being careful at the moment not to push too hard until I have a new power supply. Mine could have some damage from the ESD shock that killed my 6800XT.
I don't want to risk my expensive card at the moment.

For the TJmax, I saw max 84°C when the card was overclocked to 2750MHz@1175mV with 382W Core power.


----------



## ratirt (Apr 20, 2021)

turbogear said:


> For the TJmax, I saw max 84°C when the card was overclocked to 2750MHz@1175mV with 382W Core power.


Nice. My card sometimes hits the 90+ range when loaded to the roof. I want to get more, and bigger, fans in my chassis; I think better airflow will make some difference. 
I still haven't tried the OC. Currently I have the card undervolted to 1.1V @ 2400MHz core and 2100MHz mem, which seems OK.


----------



## Felix123BU (Apr 20, 2021)

turbogear said:


> I don't think you can flash the BIOS just like that without some modification. The new cards have a different device ID, so I don't know whether flashing the BIOS onto a regular 6900XT could brick it.
> These new cards are based on high-binned XTXH silicon. In theory, if you were lucky with your regular 6900XT and have one of the low-leakage chips, it could also hit higher frequencies.
> 
> I suppose that in the future, regular high-binned 6900XT Red Devils will become rare, as the best chips will most probably be sorted out by AMD for the XTXH variants that are going to be launched soon by multiple partners.
> ...


One can use the -f argument to force the flash; -f basically means "obey, card, NOW!" and poof, if the BIOS is not super weird you can flash an Asus BIOS to a Sapphire card and vice versa. With these cards being so elusive it's a small risk, and I say small because it's basically impossible to fatally brick a card unless you are a complete noob.


----------



## turbogear (Apr 20, 2021)

Felix123BU said:


> One can use the -f argument to force the flash; -f basically means "obey, card, NOW!" and poof, if the BIOS is not super weird you can flash an Asus BIOS to a Sapphire card and vice versa. With these cards being so elusive it's a small risk, and I say small because it's basically impossible to fatally brick a card unless you are a complete noob.


So TPU seems to have now released a flashing tool for those daring ones who want to experiment with flashing around bioses from other cards.   

AMD VBFlash / ATI ATIFlash 3.15

AMDVBFlash / ATI ATIFlash (3.31) Download – www.techpowerup.com

AMDVBFlash is used to flash the graphics card BIOS. The version released by ATI was called ATIFlash or just WinFlash.

3.15 (April 19th, 2021)
Adds support for AMD Radeon RX 6700 XT, RX 6800, RX 6800 XT, RX 6900 XT
Adds AMDVBFlashDriverInstaller.exe, written by us at TPU, which lets you easily install/uninstall the AMD driver that's now required to execute flashing


----------



## Outback Bronze (Apr 20, 2021)

turbogear said:


> 6800XT OCed



Seems like the best value card no?


----------



## Weleleh (Apr 20, 2021)

I have a Sapphire 6900XT Toxic LE; I have the Ultimate BIOS and the new AMDVBFlash, however there's no -f argument, so I can't force flash and end up getting a mismatched ID error.
Any ideas?
My card has dual BIOS, so I'm not worried.


----------



## turbogear (Apr 20, 2021)

Weleleh said:


> I have a Sapphire 6900XT Toxic LE; I have the Ultimate BIOS and the new AMDVBFlash, however there's no -f argument, so I can't force flash and end up getting a mismatched ID error.
> Any ideas?
> My card has dual BIOS, so I'm not worried.


That is what I was saying some posts back.
The Ultimate cards use a new silicon version, XTXH, released by AMD with a new device ID. 
Sorry, no idea how to force flash.
Maybe ask the creators of the tool at TechPowerUp.


----------



## Weleleh (Apr 20, 2021)

It's possible to flash via Linux, apparently.
Having mismatched IDs doesn't mean anything; we could always force flash. A die stepping change also happened on RDNA, and it worked fine there.
The only reason I can see this not working on RDNA2 is some driver-level checks done on Windows.


----------



## turbogear (Apr 20, 2021)

Outback Bronze said:


> Seems like the best value card no?


Yes, if you get a good sample that can OC very well.   
That depends on the silicon lottery.


----------



## ChristTheGreat (Apr 20, 2021)

Does anyone know how I could force a constant voltage, so I can keep the GPU frequency high even if the game doesn't use the GPU at max? I guess by BIOS flashing with a higher value?

Thing is, I want to have 2 profiles: one normal, the other for gaming :/


----------



## turbogear (Apr 20, 2021)

ChristTheGreat said:


> Does anyone know how I could force a constant voltage, so I can keep the GPU frequency high even if the game doesn't use the GPU at max? I guess by BIOS flashing with a higher value?
> 
> Thing is, I want to have 2 profiles: one normal, the other for gaming :/


I don't think there is a way to fix the voltage.
From my experimenting with multiple 6900XT and 6800XT cards, they don't hold the voltage at one value all the time; it varies as the GPU needs it, based on the boost algorithms.
The slider in the Radeon software seems to be only a kind of reference; the card does whatever it wants.   
For example, on my 6800XT I had the slider at 1020mV for 2750MHz, but the GPU was jumping between 800mV and 1110mV.

Editing the BIOS to change parameters is not yet possible for RDNA2, but even then I am not sure a fixed voltage at a fixed frequency would be possible, unless one can somehow disable the boost algorithms.
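The swing described above is easy to see if you log the sensor data and summarize it. Below is a minimal sketch of my own (not a real tool): it takes rows shaped like `csv.DictReader` output, e.g. from a sensor-log CSV export, and reports the observed voltage range. The column name is a placeholder, not any logger's exact header.

```python
# Summarize GPU core-voltage telemetry from a sensor log.
# Rows are dicts, as csv.DictReader would yield them.

def voltage_range(rows, column="GPU Core Voltage [V]"):
    """Return (min, max, mean) of the voltage samples in the log."""
    volts = [float(r[column]) for r in rows if r.get(column)]
    return min(volts), max(volts), sum(volts) / len(volts)

# Illustrative samples matching the swing described above:
samples = [{"GPU Core Voltage [V]": v} for v in ("0.800", "1.020", "1.110")]
lo, hi, avg = voltage_range(samples)
print(lo, hi)  # 0.8 1.11
```

Even with the slider at 1020mV, a log like this shows the boost algorithm ranging well below and above the set point.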


----------



## ChristTheGreat (Apr 20, 2021)

Thanks for the info. Mine runs most of the time at 875mV, so if I set the minimum GPU clock to 1800, 875mV is not enough :/. I just want to keep high frames steady. Right now I still play at 1080p, but I guess upgrading to 1440p could help a lot to keep the GPU frequency high most of the time.

Note: don't ask why I play at 1080p, I want to keep the card for a long time, with high frames  hahaha


----------



## turbogear (Apr 20, 2021)

ChristTheGreat said:


> Thanks for the info. Mine runs most of the time at 875mV, so if I set the minimum GPU clock to 1800, 875mV is not enough :/. I just want to keep high frames steady. Right now I still play at 1080p, but I guess upgrading to 1440p could help a lot to keep the GPU frequency high most of the time.
> 
> Note: don't ask why I play at 1080p, I want to keep the card for a long time, with high frames  hahaha


Setting the minimum and maximum frequency sliders with a 100MHz offset to each other can help keep the frequency within a certain window.


----------



## Raendor (Apr 20, 2021)

turbogear said:


> Setting the minimum and maximum frequency sliders with a 100MHz offset to each other can help keep the frequency within a certain window.


Seems like the voltage is more dependent on Min/Max GHz range you set, rather than voltage slider. Wonder if that’s changed anyhow with new drivers. Didn’t test yet.


----------



## Weleleh (Apr 20, 2021)

So I managed to flash the Ultimate BIOS on my Toxic in Linux. 
It works, but there are some issues... 

First of all, the driver still knows you don't have an XTXH stepping, so if you go past 3000 (core) or 2150 (memory) you get a driver reset. 

The second issue is, Vcore stays at 1018mV... 

In essence, there's zero reason for any of you to attempt to flash this BIOS on your card. It doesn't work. 

Another dangerous thing: if you don't have dual BIOS you will brick the card, and you'll need to flash it in Linux via another GPU, because you will not get VGA output outside of Windows; not even the BIOS screen will show up.


----------



## turbogear (Apr 21, 2021)

Weleleh said:


> So I managed to flash the Ultimate BIOS on my Toxic in Linux.
> It works, but there are some issues...
> 
> First of all, the driver still knows you don't have an XTXH stepping, so if you go past 3000 (core) or 2150 (memory) you get a driver reset.
> ...


Thanks for the information.
That is exactly what I was saying. XTXH is a different product stack, so I think your experience is similar to that of people who flashed the 6900XT BIOS onto 6800XT cards and ended up bricking them.  
There is a thread on igor's LAB forum where people have already discussed this.



Raendor said:


> Seems like the voltage is more dependent on Min/Max GHz range you set, rather than voltage slider. Wonder if that’s changed anyhow with new drivers. Didn’t test yet.


Yes, that is the case. With the min/max sliders you give the boost algorithms a reference frequency window in which they will try to keep the card.
What I also noticed is that on the 6900XT, stability is achieved if you give the drivers a 200MHz window rather than 100MHz like on the 6800XT.
The 6900XT usually has more clock variation, and if you are OCing it very high it needs the larger window, because it can get power limited and the clock needs to come down a little.
That is the case for my 6900XTU (Liquid Devil Ultimate).
I currently have it at 2750MHz@1175mV with the lower slider at 2550MHz; 2650MHz was causing stability issues in very intensive scenes where the clock wanted to go lower than 2650MHz.
I currently have a 382W core power setting on it, which is the default for the Unleash BIOS.
I have not yet played around with MPT.


----------



## Peter Lindgren (Apr 21, 2021)

ChristTheGreat said:


> Thanks for the info. Mine runs most of the time at 875mV, so if I set the minimum GPU clock to 1800, 875mV is not enough :/. I just want to keep high frames steady. Right now I still play at 1080p, but I guess upgrading to 1440p could help a lot to keep the GPU frequency high most of the time.
> 
> Note: don't ask why I play at 1080p, I want to keep the card for a long time, with high frames  hahaha


I had the same problem with my MSI MERC 319 6800XT. Whenever I tried to undervolt in Wattman, the voltage was stuck at 875mV and the card was very unstable. So I used More Power Tool and set the max voltage there instead. I like my card cool and quiet, so max is set to 900mV. Then I set max frequency to 2200MHz and min frequency to 2100MHz in Wattman. That works very well, and the fan never goes over 1200rpm.


----------



## ChristTheGreat (Apr 21, 2021)

Peter Lindgren said:


> I had the same problem with my MSI MERC 319 6800XT. Whenever I tried to undervolt in Wattman, the voltage was stuck at 875mV and the card was very unstable. So I used More Power Tool and set the max voltage there instead. I like my card cool and quiet, so max is set to 900mV. Then I set max frequency to 2200MHz and min frequency to 2100MHz in Wattman. That works very well, and the fan never goes over 1200rpm.


900mV for 2200MHz? Niceee


----------



## aRcane305 (Apr 22, 2021)

So I got the Toxic Limited and have been playing around with Wattman and MPT for ages, but I still can't hit 2.7GHz. Any tips for hitting 3GHz? Someone told me it was possible without a block/LN2, but I still haven't found a way to do so. The only thing I know is that the max mem clock is 2150.


----------



## turbogear (Apr 22, 2021)

aRcane305 said:


> So I got the Toxic Limited and have been playing around with Wattman and MPT for ages, but I still can't hit 2.7GHz. Any tips for hitting 3GHz? Someone told me it was possible without a block/LN2, but I still haven't found a way to do so. The only thing I know is that the max mem clock is 2150.


I guess it is the silicon lottery.  
I also had a Toxic before, which I returned.
That one was not able to hit more than 2620MHz.
Nothing I tried helped there either; pushing core power to 384W and TDC to 384A with MPT did not raise the limit.
There are very rare 6900XT samples out there that can reach 2.7GHz.
I have not seen one that can hit 3GHz, and I have been reading around many forums.
Now, with 6900XTs made from XTXH silicon, this could be possible. 
I will try to hit 3GHz with my 6900XT Liquid Devil Ultimate. This is an XTXH silicon card, and in my initial tests I am already hitting 2.7GHz in games. 
My setting is 2750MHz@1175mV, and the default power limit of my card is already 382W with the 15% slider in the Radeon driver.

With the memory setting one needs to be careful. These cards allow you to set 2150MHz, but in many cases the Time Spy score actually gets lower when you go higher than 2100MHz.
You need to test memory frequencies step by step and check at which point the score starts to decrease.
My earlier 6800XT maxed out at 2110MHz; going higher dropped the Time Spy score dramatically.
I haven't tried memory OC on my 6900XTU yet.

Since yesterday I have a new power supply, a be quiet! Dark Power Pro 12 1200W Titanium, and will now work on tuning my brand new Liquid Devil.
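The step-by-step memory test described above can be sketched in code. This is a minimal illustration of the stopping rule only; the Time Spy scores in the example are made up, and in practice each score would come from actually running the benchmark at that memory clock.

```python
# Sketch of the memory-clock sweep: raise the memory clock in steps
# and stop once the benchmark score regresses instead of improving.

def find_best_mem_clock(scores_by_clock):
    """Given {mem_clock_mhz: benchmark_score}, return the clock with
    the highest score before the first regression."""
    best_clock, best_score = None, float("-inf")
    for clock in sorted(scores_by_clock):
        score = scores_by_clock[clock]
        if score < best_score:
            break  # performance regression: stop sweeping
        best_clock, best_score = clock, score
    return best_clock

# Made-up scores showing the typical pattern (regression past 2100MHz):
scores = {2000: 17100, 2050: 17250, 2100: 17400, 2150: 16900}
print(find_best_mem_clock(scores))  # 2100
```

The point is that "higher slider = faster" stops being true once error correction or controller limits kick in, so you sweep and keep the last clock that still improved the score.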


----------



## Butanding1987 (Apr 22, 2021)

I cleaned my custom loop, including the GPU block, this week because of some deposit issues. I was shocked by what I found upon opening up the GPU block. They ate it all up.


----------



## Garlic (Apr 22, 2021)

I downloaded MPT today and I'm a bit confused about what to do; can anyone please tell me the steps? Thanks.


----------



## aRcane305 (Apr 22, 2021)

turbogear said:


> I guess it is the silicon lottery.
> I also had a Toxic before, which I returned.
> That one was not able to hit more than 2620MHz.
> Nothing I tried helped there either; pushing core power to 384W and TDC to 384A with MPT did not raise the limit.
> ...


Did you try using Wattman to bump the power up 50%? Right now I'm sliding max clock to 2.7GHz, mem to 2150MHz, and power to 50%, but Wattman keeps freezing every time I try.

OK yeah, I can't reach anything. All I'm doing is the 15% uncap with Wattman. A manual overclock seems to hit 2662MHz, yet somehow Toxic Boost is stuck at 2620. So leaving Toxic Boost disabled lets it run faster? At the expense of 16Gbps instead of 16.8Gbps memory?


----------



## Weleleh (Apr 22, 2021)

My Toxic does 2680/2700 effective on the core in games.
My 6800XT Nitro+ SE did more, around 2750MHz effective. 

About memory: both my 6800XT and 6900XT do 2150MHz with fast timings, but now that you mention performance degradation, I didn't test for that. Guess I should.

About Toxic Boost: 16.8Gbps is probably the memory slider maxed out, or close to it, so manually overclocking it should yield better performance.


----------



## Space Lynx (Apr 22, 2021)

Weleleh said:


> My Toxic does 2680/2700 effective on the core in games.
> My 6800XT Nitro+ SE did more, around 2750MHz effective.
> 
> About memory: both my 6800XT and 6900XT do 2150MHz with fast timings, but now that you mention performance degradation, I didn't test for that. Guess I should.
> ...



OCing the VRAM really doesn't gain anything in games. I leave my VRAM at default and OC the GPU core only, with the power slider at 15%, 2100 min and 2600 max, and a strong fan curve. I typically get 2400-2500MHz in games; that's where the fps comes from. Honestly, I'm not sure why people bother OCing VRAM, I've never seen it give me more fps.


----------



## moproblems99 (Apr 22, 2021)

I just got my reference 6900xt today.  But holy junction temps.  After about 10 minutes in rdr2, my gpu temp was 73C and my junction temp was fluctuating between 102C and 108C.  Cranking the fans up to 90% will get the JT down to 102C and removing the glass panel did nothing.  Errrffff.....I think it is time to just block it.


----------



## Raendor (Apr 22, 2021)

Do I get it right that Radeon software conflicts with msi afterburner? Always get errors in RDR2 when I turn it on. If so, anyone tried to run drivers only and do everything else (tweaking, monitoring) via afterburner instead?


----------



## GamerGuy (Apr 22, 2021)

moproblems99 said:


> I just got my reference 6900xt today.  But holy junction temps.  After about 10 minutes in rdr2, my gpu temp was 73C and my junction temp was fluctuating between 102C and 108C.  Cranking the fans up to 90% will get the JT down to 102C and removing the glass panel did nothing.  Errrffff.....I think it is time to just block it.


What case are you using? Usually, when the fans are cranked up, there should be a noticeable drop in temp, I'd expect that at 90% fanspeed, the JT should be in the 90's, unless airflow isn't ideal.


----------



## moproblems99 (Apr 22, 2021)

GamerGuy said:


> What case are you using?


One of these:


----------



## turbogear (Apr 22, 2021)

aRcane305 said:


> Did you try using Wattman to bump the power up 50%? Right now I'm sliding max clock to 2.7GHz, mem to 2150MHz, and power to 50%, but Wattman keeps freezing every time I try.
> 
> OK yeah, I can't reach anything. All I'm doing is the 15% uncap with Wattman. A manual overclock seems to hit 2662MHz, yet somehow Toxic Boost is stuck at 2620. So leaving Toxic Boost disabled lets it run faster? At the expense of 16Gbps instead of 16.8Gbps memory?


You cannot bump the power by 50% in Wattman; only 15% is possible there.
To push higher, one needs to use MPT.

When I had the 6900XT Toxic I also noticed that when you have TriXX installed and try to manually tune stuff in the Radeon driver, there are conflicts.


----------



## moproblems99 (Apr 22, 2021)

GamerGuy said:


> What case are you using? Usually, when the fans are cranked up, there should be a noticeable drop in temp, I'd expect that at 90% fanspeed, the JT should be in the 90's, unless airflow isn't ideal.


Additionally, I just put in a higher-rpm exhaust fan and removed all the glass panels: 68°C GPU temp and 104°C junction temp.


----------



## turbogear (Apr 22, 2021)

Butanding1987 said:


> I cleaned my custom loop, including the GPU block, this week because of some deposit issues. I was shocked by what I found upon opening up the GPU block. They ate it all up. View attachment 197696 View attachment 197698 View attachment 197699


That looks really scary.


----------



## BarbaricSoul (Apr 22, 2021)

Butanding1987 said:


> I cleaned my custom loop, including the GPU block, this week because of some deposit issues. I was shocked by what I found upon opening up the GPU block. They ate it all up. View attachment 197696 View attachment 197698 View attachment 197699


Ok, I'm no expert, but I have never heard of that happening to thermal pads before. It looks like they completely disintegrated. Makes me wonder what brand pads were used (so I'll know to never use them).


----------



## turbogear (Apr 22, 2021)

Garlic said:


> I downloaded MPT today and I'm a bit confused about what to do; can anyone please tell me the steps? Thanks.


You can find the information on how to use MPT in the following post of mine. 
I also linked a guide from igor's LAB there.









The RX 6000 series Owners' Club – www.techpowerup.com

Did some more RAM testing on my 6800XT, and it seems that for me 2080mhz with fast timings or 2110 with normal timings is the limit before I start getting performance regression. I guess this is related to the voltage allowed by AMD for the memory and/or controller, that's probably why it is...


----------



## Weleleh (Apr 22, 2021)

So it turns out +150 on memory actually causes regression.

+100 gives me the best results. Thanks for the heads-up, I completely overlooked this.

To end my headaches going back and forth, I just enabled Toxic Boost, which gives +100 to memory and 2660 on the core.


----------



## turbogear (Apr 22, 2021)

Weleleh said:


> So it turns out +150 on memory actually causes regression.
> 
> +100 gives me the best results. Thanks for the heads-up, I completely overlooked this.
> 
> To end my headaches going back and forth, I just enabled Toxic Boost, which gives +100 to memory and 2660 on the core.


Glad I was able to help. 
I noticed that myself some time ago when I was experimenting with the 6800XT.
A lot of cards only go to 2100MHz before you get regression.


----------



## deadman3000 (Apr 22, 2021)

How high are you 6700 XT owners pushing the MPT power limit to break past 2800MHz? I'm using the XFX AIB limit of 211 watts but want to see where the 'safe' limits are without burning out my card lol.


----------



## aRcane305 (Apr 22, 2021)

turbogear said:


> You cannot bump the power by 50% in Wattman; only 15% is possible there.
> To push higher, one needs to use MPT.
> 
> When I had the 6900XT Toxic I also noticed that when you have TriXX installed and try to manually tune stuff in the Radeon driver, there are conflicts.


Yes, I used MPT and set power to 50%; Wattman only goes as far as 15%. So does that mean that even if I set it in MPT, it won't actually be 50% more power to the card, because Wattman sets it at 15%?


----------



## Butanding1987 (Apr 23, 2021)

BarbaricSoul said:


> Ok, I'm no expert, but I have never heard of that happening to thermal pads before. It looks like they completely disintegrated. Makes me wonder what brand pads were used (so I'll know to never use them).


The red ants ate them. That explains why I saw them marching toward the PC a couple months ago. I thought there was food inside the case so I used some insecticide chalk on the case and around the GPU block. I didn’t know they were having a feast on the circuit board and had munched on the thermal pads.


----------



## turbogear (Apr 23, 2021)

aRcane305 said:


> Yes, I used MPT and set power to 50%; Wattman only goes as far as 15%. So does that mean that even if I set it in MPT, it won't actually be 50% more power to the card, because Wattman sets it at 15%?


If you apply higher power in MPT, that works; the 15% from the Radeon driver is added on top of what you set in MPT.

For example, if you increase the GPU core power in MPT to 290W and apply 15% in the driver, the total GPU core power will be:
290W x 1.15 = 333.5W

But note that this is only GPU core power. To find the total power consumption of your card, you need to add roughly another 45W for the other components on the card.
So in the above example that would be around 378.5W total power.

The specified limit for cards with two PCIe power cables is 375W.
So, as the above example shows, if you go higher than 287W on the core, you are running the card out of the official PCIe spec.  

Here is how you calculate the specified PCIe power depending on the number of connectors:
2 connectors: 2x150W+75W=375W
3 connectors: 3x150W+75W=525W

In the above equation, the 75W comes from the PCIe slot on the mainboard.
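To sanity-check the arithmetic above, here is a tiny helper of my own (not part of MPT or the driver) that mirrors the same calculation:

```python
def total_board_power(mpt_core_w, slider_pct=15, other_components_w=45):
    """Total card power: the MPT core limit scaled by the driver power
    slider, plus ~45W for memory, VRM, fans and other components."""
    return round(mpt_core_w * (1 + slider_pct / 100) + other_components_w, 1)

def pcie_spec_limit(num_cables):
    """Official PCIe limit: 150W per cable + 75W from the slot."""
    return num_cables * 150 + 75

print(total_board_power(290))  # 378.5 -> just over the 2-cable spec
print(pcie_spec_limit(2))      # 375
print(pcie_spec_limit(3))      # 525
```

Running it confirms that 290W in MPT plus the 15% slider lands at about 378.5W total, just past the 375W limit for a two-connector card.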



Butanding1987 said:


> The red ants ate them. That explains why I saw them marching toward the PC a couple months ago. I thought there was food inside the case so I used some insecticide chalk on the case and around the GPU block. I didn’t know they were having a feast on the circuit board and had munched on the thermal pads.


These ants look dangerous.
They will dig a hole in the GPU core and make their nest for the winter.


----------



## Garlic (Apr 23, 2021)

deadman3000 said:


> How high are you 6700 XT owners pushing the MPT power limit to break past 2800MHz? I'm using the XFX AIB limit of 211 watts but want to see where the 'safe' limits are without burning out my card lol.


My Nitro+ can break 2800 without MPT, but I have the power limit set to 240W in MPT anyway. The max frequency that you set in Radeon software will always be about 50-100MHz lower in games; I have mine set to 2900 max frequency and 2800 min, and the card does about 2810-2840 in games.


----------



## Raendor (Apr 24, 2021)

Do you guys have any problems with Afterburner? I basically can't run it with RDR2; the game crashes on start. When it's not running, all good. But the problem is that monitoring via Afterburner is way better than via Radeon software. I suspect the two just conflict with each other, so I wonder if installing the driver-only version of the latest Adrenalin and setting clock/voltage via Afterburner instead will fix the issue. Any advice?


----------



## Butanding1987 (Apr 24, 2021)

Raendor said:


> Do you guys have any problems with Afterburner? I basically can't run it with RDR2; the game crashes on start. When it's not running, all good. But the problem is that monitoring via Afterburner is way better than via Radeon software. I suspect the two just conflict with each other, so I wonder if installing the driver-only version of the latest Adrenalin and setting clock/voltage via Afterburner instead will fix the issue. Any advice?


You don't need Afterburner if all you're after is monitoring. Riva Tuner will do.


----------



## Raendor (Apr 24, 2021)

Butanding1987 said:


> You don't need Afterburner if all you're after is monitoring. Riva Tuner will do.


RivaTuner doesn't do anything for me if Afterburner is not installed, and I can't set all the required metrics in it without Afterburner. Tried that some time ago.


----------



## GamerGuy (Apr 24, 2021)

Weird, while I have no issue with AB, I do have CTD (even hard locked a couple of times) when I have GPU-Z enabled and running in background to ascertain my max temp/core/mem/etc. It'd either CTD or freeze, forcing me to hit the 'Reset' button. Without GPU-Z running, it's plain sailing...


----------



## Butanding1987 (Apr 24, 2021)

Raendor said:


> RivaTuner doesn't do anything for me if Afterburner is not installed, and I can't set all the required metrics in it without Afterburner. Tried that some time ago.


You need to learn how to make your own overlay using the OverlayEditor plugin in RivaTuner. There are many tutorials on the web about this. Here's what I made myself; I don't have Afterburner.


----------



## jesdals (Apr 24, 2021)

I was just playing around with some wattage measurement at the wall - peak usage 715 watts.




But that's with a baseline of around 200 watts for my setup excluding the PC.



The setup also includes a secondary desktop and monitor, both shut down - but the monitor still uses some power. So around 515 watts peak and 450 average while gaming at 7680x1440 in Division 2.

Will do some direct testing of PC consumption later - but just wanted to see what my setup was using.


----------



## dragospetre88 (Apr 24, 2021)

Guys, Red Devil RX 6900 XT Limited Edition user here. I have a question I've been trying to find an answer to for a while, with no luck. In short, I found that in AC Valhalla and Horizon Zero Dawn my card can sustain 2700+ MHz (1138 memory) with 2650 min - 2750 max, 1175mV, 15% extra power. In any other game, the card can't go higher than 2500MHz. Both HZD and Valhalla stay at approx 290-300W power consumed, while all the other games draw up to 320W, which is max TDP, yet clock speeds don't go higher than 2500MHz. I use 60% fans, temps 65 degrees max. If I use More Power Tool to raise max TDP up to 360W, other games can go to 2650MHz, HZD and Valhalla 2720 max (I get crashes if I set 2700-2800). Now, why can those two games sustain 2700MHz with no MPT?


----------



## Space Lynx (Apr 24, 2021)

jesdals said:


> I was just playing around with some wattage measurement at the wall - peak usage 715 watts.
> 
> View attachment 197982
> But that's with a baseline of around 200 watts for my setup excluding the PC.
> ...



The only thing I would improve upon, with that elite a setup: your monitors are severely lacking. I'd like to see you get a 48" OLED wall mounted at an angle, so you can push back your chair, lean back, and enjoy controller-based games in that sexy sexy OLED goodness.

I'm a bit of an OLED slut at this point, so don't mind me.  just passing by...






edit:  better yet toss that chair in trash, get a lazy boy recliner, then kick back with the OLED wall mounted. oh sweet mama.


----------



## jesdals (Apr 24, 2021)

lynx29 said:


> edit:  better yet toss that chair in trash, get a lazy boy recliner, then kick back with the OLED wall mounted. oh sweet mama.


Well, I still have to do some work from home, that's why the 3-monitor setup is the better choice for me - but I will consider a better chair.


----------



## turbogear (Apr 24, 2021)

dragospetre88 said:


> Guys, Red Devil RX 6900 XT Limited Edition user here. I have a question I've been trying to find an answer to for a while, with no luck. In short, I found that in AC Valhalla and Horizon Zero Dawn my card can sustain 2700+ MHz (1138 memory) with 2650 min - 2750 max, 1175mV, 15% extra power. In any other game, the card can't go higher than 2500MHz. Both HZD and Valhalla stay at approx 290-300W power consumed, while all the other games draw up to 320W, which is max TDP, yet clock speeds don't go higher than 2500MHz. I use 60% fans, temps 65 degrees max. If I use More Power Tool to raise max TDP up to 360W, other games can go to 2650MHz, HZD and Valhalla 2720 max (I get crashes if I set 2700-2800). Now, why can those two games sustain 2700MHz with no MPT?


Well, I experienced the same thing with Valhalla.
With my previous 6800XT, which was OCed to 2750MHz@1020mV, I used to get a clock of 2700MHz in this game with a power consumption of around 250W on the core.
Cyberpunk used to maintain 2.7GHz at around 290W core power.
Control, on the other hand, is a game that pushed the card to its limit: anything higher than 2675MHz@985mV was unstable, and at that setting the maximum game clock was 2620MHz with the GPU core consuming 340W.
Every game differs in how it tasks the GPU.
In games where the GPU is heavily tasked, more power can help maintain higher clocks, but not always; as Control shows, the higher settings can simply be unstable.



jesdals said:


> Well, I still have to do some work from home, that's why the 3-monitor setup is the better choice for me - but I will consider a better chair.


I own one of those noblechairs EPIC edition chairs. 
It is good for both gaming and home office.








EPIC – noblechairs.com


----------



## Raendor (Apr 24, 2021)

Butanding1987 said:


> You need to learn how to make your own overlay using the OverlayEditor plugin in RivaTuner. There are many tutorials on the web about this. Here's what I made myself; I don't have Afterburner.
> 
> View attachment 197981


Thanks, I didn't know about that. But I still wonder if it's Afterburner that's causing the incompatibility with RTSS as well. Though if RTSS works fine for you, then it must be OK to use standalone.


----------



## turbogear (Apr 24, 2021)

Raendor said:


> Thanks, I didn't know about that. But I still wonder if it's Afterburner that's causing the incompatibility with RTSS as well. Though if RTSS works fine for you, then it must be OK to use standalone.


I use RivaTuner Statistics Server for collecting FPS data, but together with HWiNFO.
HWiNFO has an option to push its readings into the RivaTuner overlay, so the FPS from RivaTuner is displayed together with any of the countless other sensors HWiNFO can access on the computer. 

This way I am able to see, for example, GPU and CPU clocks, power consumption, temperatures, voltages, FPS, etc. while in games.


----------



## Space Lynx (Apr 24, 2021)

turbogear said:


> I own one of those noblechairs EPIC edition.
> It is good for both gaming and home office.
> 
> 
> ...



I actually bought a small recliner for my desk, it was like $300 (local furniture store). I just scoot it out a little more whenever I want to kick up my feet and play a controller game or watch a movie. It's heaven, lol, I can sleep in it too. When scooted in I can use kb/m just fine.


----------



## Garlic (Apr 26, 2021)

In MPT I took the power limit to 270W, but in games it's still the same as before. Did I do something wrong, or does the card just not draw that much power?


----------



## Butanding1987 (Apr 26, 2021)

Garlic said:


> In MPT I took the power limit to 270W, but in games it's still the same as before. Did I do something wrong, or does the card just not draw that much power?
> 
> *View attachment 198152*


Did you click "Write SPPT"? You also need to restart for the settings to take effect. Also, your card won't draw more power unless you overclock the core or VRAM.


----------



## Garlic (Apr 26, 2021)

Butanding1987 said:


> Did you click "Write SPPT"? You also need to restart for the settings to take effect. Also, your card won't draw more power unless you overclock the core or VRAM.


Yes, I did all of the above and it's overclocked. Maybe my GPU has bad VRMs that are unable to give it more power?


----------



## Butanding1987 (Apr 27, 2021)

I've changed to Thermalright pads (12.8 W/mK) and noticed a considerable drop in both core and hot spot temperatures. My new average temp after a Port Royal run is now about 46°C, compared with 54°C before. I know it doesn't translate to any real-world benefit, but it's good to stare at a better number (and I mean the temp, not the performance numbers, which I'm sure haven't changed much).
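For what it's worth, the effect of a higher-conductivity pad can be sanity-checked with the one-dimensional conduction formula ΔT = P·t/(k·A). The thickness, pad area, and per-chip power below are assumed round numbers for illustration, not measurements of the Nitro+:

```python
# One-dimensional conduction across a thermal pad: dT = P * t / (k * A).
# Thickness, pad area, and per-chip power are assumed round numbers for
# illustration, not measurements of the Nitro+ memory pads.

def pad_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature drop (degC) across a pad conducting power_w watts."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Assume 5 W through a 1.5 mm pad over a 14 x 12 mm GDDR6 package:
area = 0.014 * 0.012
print(round(pad_delta_t(5, 0.0015, 12.8, area), 1))  # 12.8 W/mK pad -> 3.5
print(round(pad_delta_t(5, 0.0015, 6.0, area), 1))   # ~6 W/mK stock -> 7.4
```

Under these assumptions the 12.8 W/mK pad roughly halves the temperature drop across the pad versus a typical ~6 W/mK stock pad, consistent with a few degrees of improvement at the sensor.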


----------



## turbogear (Apr 27, 2021)

Butanding1987 said:


> I've changed to Thermalright pads (12.8w/mk) and I noticed a considerable drop in both core and hot spot temperatures. My new average temp after a Port Royal run is now about 46 degrees C compared with 54 degrees C before. I know it doesn't translate to any real-world benefits but it's good to stare at a better number (and I mean the temp, not the performance numbers which I'm sure haven't changed much).
> 
> View attachment 198251
> 
> View attachment 198253


That's nice. 
Your new temperatures are comparable to what I had on my 6800XT.

But I don't understand why changing the pads for the VRM and VRAM would improve your core and junction temperatures, unless you mean that you now use a thermal pad on the GPU chip instead of thermal paste. 

I was using 12 W/mK pads for the VRM and VRAM and Thermal Grizzly Kryonaut for the GPU chip.


----------



## Butanding1987 (Apr 27, 2021)

turbogear said:


> That's nice.
> Your new temperatures are comparable to what I had on my 6800XT.
> 
> But I don't understand why changing the pads for the VRM and VRAM would improve your core and junction temperatures, unless you mean that you now use a thermal pad on the GPU chip instead of thermal paste.
> ...



You're right. LOL. My memory junction temp stays in the low 40s now. As for my core temp, I forgot to mention that I switched from Kryonaut to Conductonaut liquid metal. Our ambient temp here is about 35°C (yes, it's hot and humid).


----------



## turbogear (Apr 27, 2021)

Butanding1987 said:


> You're right. LOL. My max junction temp is now in the 60s from the high 70s earlier. About my core temp, I forgot to mention that I shifted from Kryonaut to Conductonaut liquid metal.


That makes sense now. 
I haven't used Conductonaut in some time. 
I last used it on an i7 7700K after delidding it. 

I hope you used some insulating coating on the components close to the exposed GPU die.


----------



## Butanding1987 (Apr 27, 2021)

turbogear said:


> That makes sense now.
> I haven't used Conductonaut in some time.
> I last used it on an i7 7700K after delidding it.
> 
> I hope you used some insulating coating on the components close to the exposed GPU die.


I did use nail polish and Kapton tape around the die. I also applied just enough liquid metal that it doesn't flow onto other parts (especially since the graphics card sits vertically).


----------



## dragospetre88 (May 3, 2021)

Hello. Guys, what are your max junction temps on the RX 6900 XT? On my Red Devil they reach 105 degrees when max overclocked (2650 MHz, 2138 memory) and about 85 degrees without the power limit slider at the max. An extra 20 degrees for a 100 MHz boost in clock is not worth it... But still, why 105 degrees with the power slider at the max +15%?!


----------



## turbogear (May 3, 2021)

dragospetre88 said:


> Hello. Guys, what are your max junction temps on the RX 6900 XT? On my Red Devil they reach 105 degrees when max overclocked (2650 MHz, 2138 memory) and about 85 degrees without the power limit slider at the max. An extra 20 degrees for a 100 MHz boost in clock is not worth it... But still, why 105 degrees with the power slider at the max +15%?!


105°C is really high.  
Are you on the default fan profile?
When I had a normal 6900XT Red Devil, it went to 95°C with a custom fan profile at 2675MHz@1075mV, but I had 384W core power and 384A TDC set in MPT.
Even with that high power setting I never saw higher temperatures, as my custom fan profile had the fans spinning at 90% above 90°C.

I have read on a number of forums that people sometimes had cards with high temperatures due to build quality issues.
In particular, some owners of Liquid Devil cards had very high temperatures. 

One person at Igor's Lab opened his Liquid Devil and applied new paste to lower the temperatures, but that would void the warranty, at least in Germany.
Another owner of a 6900XT Liquid Devil Ultimate applied liquid metal to lower the temperatures.

After my custom-water-cooled 6800XT died, I will try in future to be more careful about opening a card and losing the warranty, especially when one needs to sell a kidney to buy a new one.


----------



## dragospetre88 (May 3, 2021)

No matter what fan curve I choose, junction temps hit 100+ degrees in some games when I use 2600-2700 MHz and the +15% power profile. And that's without messing with MPT, so power maxes out at around 323 W. Normal temps (non-junction) sit below 70 degrees even with the fans at 50%, which is barely audible. Maybe some of my GPU memory modules aren't properly cooled or something... But as soon as I lower the power bar to 0, junction drops 20 degrees, down to 80-85 degrees. What to do?


----------



## turbogear (May 3, 2021)

dragospetre88 said:


> No matter what fan curve I choose, junction temps hit 100+ degrees in some games when I use 2600-2700 MHz and the +15% power profile. And that's without messing with MPT, so power maxes out at around 323 W. Normal temps (non-junction) sit below 70 degrees even with the fans at 50%, which is barely audible. Maybe some of my GPU memory modules aren't properly cooled or something... But as soon as I lower the power bar to 0, junction drops 20 degrees, down to 80-85 degrees. What to do?


The junction temperature is not related to the memory modules.
It is the temperature of the GPU chip's hot spot. There are many temperature sensors inside the GPU chip, and as far as I understand, the junction temperature shows the hottest spot among them.
To me this looks like poor contact between the GPU and the heatsink.
Maybe you should contact the shop where you bought it and ask for a replacement.
If you try to repair it yourself by opening it and applying new thermal paste, you could void the warranty.
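A quick way to reason about this is to look at the spread between the edge ("GPU") temperature and the junction temperature rather than the absolute values. The thresholds below are a common forum rule of thumb, not an AMD specification, and the helper is purely illustrative:

```python
# Rough cooler-contact sanity check: compare the edge ("GPU") temperature
# with the junction/hotspot temperature. The 20/30 degC thresholds are a
# common forum rule of thumb, not an AMD specification.

def contact_quality(edge_c, junction_c):
    delta = junction_c - edge_c
    if delta <= 20:
        return "normal delta"
    if delta <= 30:
        return "elevated delta: check pads/paste and mounting pressure"
    return "large delta: likely poor die-to-cold-plate contact"

# The numbers reported above: ~70 degC edge but 100+ degC junction.
print(contact_quality(70, 102))
# A healthy water-cooled card (e.g. 43/63 degC) for comparison:
print(contact_quality(43, 63))
```

With ~70°C edge against 100+°C junction, the delta is well past what a well-mounted cooler usually shows, which supports the poor-contact theory.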


----------



## dragospetre88 (May 3, 2021)

Ok. Thank you very much!


----------



## turbogear (May 3, 2021)

dragospetre88 said:


> Ok. Thank you very much!


I hope the shop will help you and doesn't make you wait for weeks until they replace it. 

On the other hand, there is a big silicon lottery out there when it comes to good OCing cards. Keep in mind that a replacement card could be either better or worse in OC performance; that is hard for anybody to predict.


----------



## dragospetre88 (May 3, 2021)

But still, if there's a contact issue between the GPU and heatsink, shouldn't I see high junction temps all the time, and not only when I push the power slider all the way up?


----------



## Felix123BU (May 3, 2021)

dragospetre88 said:


> But still, if there's a contact issue between the GPU and heatsink, shouldn't I see high junction temps all the time, and not only when I push the power slider all the way up?


Yes and no. I took the reference cooler off my 6800XT, peeled the thermal pad off, and put paste on it instead. At under 150W loads it was fine; at over 150W loads the hotspot got out of hand really fast and really bad (110C bad).
That told me the cold plate and chip did not make 100% contact with paste, or at least some small part didn't. No matter what I tried, I could not bring the hotspot back to normal. Got a water block; the hotspot is now nice and cool.

The reason some of these cards come with a thermal pad on the GPU instead of paste is that on some models the cold plate is not very even, so a thicker medium is needed to make contact across the whole chip.
Not sure what your model is, but it could be that from the factory there is not good enough contact, and in very high wattage scenarios the cooler just can't keep up because of the uneven contact.

Also, these cards guarantee a certain speed at a certain power limit, so if the hotspot only gets out of hand when overclocking, you might be out of luck with the warranty. I would send it in for repair or replacement without mentioning the OC part.


----------



## dragospetre88 (May 3, 2021)

I have a PowerColor Red Devil RX 6900 XT Limited Edition. Indeed, I used MPT to push power over 360 W in some scenarios... Finally, I settled on stock power parameters with only the power slider maxed. But what you told me might indeed be the case: past a certain wattage, my card might not function properly any longer.


----------



## Felix123BU (May 3, 2021)

dragospetre88 said:


> I have a powercolor red devil rx 6900 xt limited edition. Indeed, i used mpt to push power over 360 W in some scenarios...Finally, I settled for stock power parameters, with only power slider maxed. But what you told me might indeed be the case: past a certain wattage, my card might not function properly any longer.


Can't say I am an expert on that model, but what you are describing sounds like either your case is badly ventilated, which would make hot air get recirculated and drastically regress cooling performance (from what I read above, that's not the case), or some issue with the mounting of the cooler. That card's cooler should be good enough to allow an easy OC without turning it into an oven, so my guess is the cooler is not ideally mounted, hence at higher wattage it basically craps out. At the risk of voiding the warranty, you could try tightening the 4 main screws or adding some washers, but if your card has warranty seals, those would be broken in the process, so it's up to you how you want to handle it. In my opinion, 100-200 extra MHz is not a big enough performance jump to merit all the hassle.


----------



## Aranarth (May 3, 2021)

I got a Radeon 6800 XT in a Dell Alienware R10, does that count?


----------



## turbogear (May 3, 2021)

Aranarth said:


> I got a Radeon 6800xt, in a dell Alienware r10 does that count?


Of course it counts.
Welcome to the club.



dragospetre88 said:


> I have a powercolor red devil rx 6900 xt limited edition. Indeed, i used mpt to push power over 360 W in some scenarios...Finally, I settled for stock power parameters, with only power slider maxed. But what you told me might indeed be the case: past a certain wattage, my card might not function properly any longer.


As I said before, I had a 6900XT Red Devil that I returned.
I returned it because it was not much faster than my 6800XT, which OCed like crazy.  
I pushed that Red Devil to 2675MHz@1075mV with 384W and 384A TDC via MPT, but the highest temperature was only 95°C.

It actually varied between 88°C and 95°C under those OC conditions and did not go higher.

As I mentioned earlier, there seems to be a contact problem between the cooler and GPU in your case.
At higher power settings the leakage increases, and if the cooler does not make good contact, the heat will not be removed effectively.

As @Felix123BU mentioned, AMD uses a thermal pad between the GPU and heatsink on the reference design, not thermal paste. A thermal pad helps when the heatsink surface is not perfectly even and avoids issues with it not completely touching the GPU.
The custom cards use thermal paste, and I have heard many complaints in different forums from owners of custom cards about very high temperatures.


----------



## Aranarth (May 3, 2021)

BTW, I like the 6800XT. I have a friend who knew I had an RX 580 8GB. I told him I was finally playing Borderlands 3 for the first time, and that I got a new Gigabyte ultrawide monitor (3440x1440)...

"Oh? So what settings are you running the game at? Isn't it laggy?"
"All of them at max...  not laggy at all, 100+ FPS"
"WTF?!"
Then I told him I got a Radeon 6800XT with my new machine running a Ryzen 5 5600X 
"Well, CRAP!"


----------



## Felix123BU (May 3, 2021)

Aranarth said:


> BTW, i like the 6800xt, I got a friend who knew I had a rx580 8gb, I told him I was finally playing borderlands 3 for the time and I got a new gigabyte ultra wide screen monitor (3440x1440)...
> 
> "Oh? so what setting you running the game at? Isn't it laggy?"
> "All of em to the max....  not laggy at all 100+ FPS"
> ...


I also have an ultrawide monitor (3440x1440) (BenQ EX3501R) with the 6800XT, perfect match


----------



## GamerGuy (May 4, 2021)

Aranarth said:


> I got a Radeon 6800xt, in a dell Alienware r10 does that count?


A belated welcome to the club, where RX 6000 card owners sip champagne and munch on caviar!  (I wish!) 

I have a rather aggressive fan setting on mine; with the ambient temp in my room around 23C, my junction temp is around 80C and my core temp in the mid 60s. I dunno if it's the new driver or my recently serviced aircon (they did a total chemical wash on the blower), but temps are pretty good now.


----------



## Aranarth (May 4, 2021)

I have not really looked at overclocking the card. Does not seem to need it.

The CPU has PBO turned on and ram is running at 3200.

I have replaced the front fan and water cooler fan with some high-end corsair fans (radiator is now in fan sandwich) so the machine is quieter.
Other than that I'm happy with the performance.

The BIOS does not appear to have the latest AGESA, so Resizable BAR support is not available.
It has some other quirks too, such as: if you do a hard shutdown by holding in the power button, you have to reset the BIOS to get it to boot back up again.
Typical Dell.

Hopefully in 3-4 years I can build a new computer from scratch again.


----------



## GamerGuy (May 4, 2021)

Played the COD MW WZ SP campaign for about an hour; here's the max sensor reading with GPU-Z. The memory clock reading seems wonky though, which kinda raises doubts about everything. Still, as I was playing the game and glancing at MSI AB, temps were always in the 70s; I don't recall it hitting 80C at all.


----------



## jesdals (May 4, 2021)

6900XTX cards in stock in Denmark - though at a hefty mark up





						Graphics cards. Buy your next GFX / GPU for gaming here - Best in Test
					

Here you will find a large selection of graphics cards / GPUs / GFX. We carry graphics cards from the well-known brands, best in test, at the best prices. Physical store




					www.proshop.dk


----------



## Felix123BU (May 4, 2021)

jesdals said:


> 6900XTX cards in stock in Denmark - though at a hefty mark up
> 
> 
> 
> ...


Well, Lisa Su said they are going to ramp up 6000 series production; she did not say anything about the price


----------



## turbogear (May 4, 2021)

jesdals said:


> 6900XTX cards in stock in Denmark - though at a hefty mark up
> 
> 
> 
> ...


Yes, those prices are higher than in Germany.
Mindfactory is selling the 6900XTU Liquid Devil Ultimate for 2299€.
I paid that price for mine. 



			https://www.mindfactory.de/product_info.php/16GB-Powercolor-Radeon-RX-6900-XTU-Liquid-Devil-Ultimate-DDR6--Retail-_1405065.html


----------



## GamerGuy (May 4, 2021)

turbogear said:


> Yes the prices are higher than in Germany.
> Mindfactory is selling 6900XTU Liquid Devil Ultimate for 2299€.
> I paid that price for mine.
> 
> ...


Holy...!!!    And here I was sputtering and damn near having convulsions at having to pay 1350USD for my (by now) vanilla Nitro+ RX 6900 XT! Those cards cost more than twice what I'd paid for mine!!

 *Grabs chest and collapse*


----------



## Felix123BU (May 4, 2021)

turbogear said:


> Yes the prices are higher than in Germany.
> Mindfactory is selling 6900XTU Liquid Devil Ultimate for 2299€.
> I paid that price for mine.
> 
> ...


Was going to say a sacrilegious thing, for 2299 EUR you could have gotten a 3090....but I just checked, those are starting from 3200 EUR    Insane prices 



GamerGuy said:


> Holy...!!!    And here I was sputtering and damn near having convulsions at having to pay 1350USD for my (by now) vanilla Nitro+ RX 6900 XT! Those cards cost more than twice what I'd paid for mine!!
> 
> *Grabs chest and collapse*


Well, I got my 6800XT for 710 EUR, in hindsight that was dirt cheap


----------



## turbogear (May 4, 2021)

Felix123BU said:


> Was going to say a sacrilegious thing, for 2299 EUR you could have gotten a 3090....but I just checked, those are starting from 3200 EUR   Insane prices
> 
> 
> Well, I got my 6800XT for 710 EUR, in hindsight that was dirt cheap


Yes, I looked at 3090 at that time but the prices were even crazier.  

For my XFX 6800XT, I paid only 299€ delta. 
I bought it for 999€ and sold Radeon VII for 700€.

If my 6800XT did not get damaged by ESD shock  I would not have bought 6900XTU.

As a matter of fact, I first bought a 6800XT Red Devil as a replacement for the dead XFX 6800XT, but that sample was not good: it did not OC over 2400MHz, and even at that clock it showed artifacts when running Time Spy.

My XFX 6800XT used to run at 2750MHz@1020mV. 
So after returning the Red Devil I was pissed off, went a bit crazy, and spent a damn lot of money on a high-performing 6900XTU rather than play the silicon lottery by ordering another 6800XT. 

By the way, I looked today at what a regular 6900XT Red Devil costs now. It is 1799€, and it cost 1399€ in mid-March, so it has gone up by 400€ since then. 



			https://www.mindfactory.de/product_info.php/16GB-Powercolor-Radeon-RX-6900XT-Red-Devil-DDR6-Triple-Cooler--Retail-_1388403.html


----------



## Felix123BU (May 4, 2021)

turbogear said:


> Yes, I looked at 3090 at that time but the prices were even crazier.
> 
> For my XFX 6800XT, I paid only 299€ delta.
> I bought it for 999€ and sold Radeon VII for 700€.
> ...


Just a couple of days ago the gloom-and-doom YouTube squad was saying GPUs would soon start returning to normal price levels. I guess Ethereum wanted to disagree with them; as long as Ethereum goes up, GPU prices will follow.  

When I got the 6800XT I thought, damn, that's the most I have ever paid for a GPU. But then again, I said the same thing when I paid 500 for the Vega 64, and now 710 EUR for a 6800XT is a price no one could even dream of getting....


----------



## moproblems99 (May 5, 2021)

EK Quantum Block, Backplate, and a 560mm rad just arrived!


----------



## Felix123BU (May 5, 2021)

moproblems99 said:


> EK Quantum Block, Backplate, and a 560mm rad just arrived!


Out of curiosity, what case do you have that supports a 560mm rad?    And enjoy the new cooling!


----------



## moproblems99 (May 5, 2021)

Felix123BU said:


> Out of curiosity, what case do you have that supports a 560mm rad?    And enjoy the new cooling!


Tower 900.  I can sleep in it still after the radiator.


----------



## Felix123BU (May 5, 2021)

moproblems99 said:


> Tower 900.  I can sleep in it still after the radiator.


I like Thermaltake cases; they're my favorite case manufacturer. Sadly I had to give my Thermaltake Chaser MK-1 to the wife, since I wanted a full loop for the 6800XT and would have had to do a lot of hard modding to make that case fully water-friendly. Still love it.


----------



## moproblems99 (May 5, 2021)

Felix123BU said:


> I like Thermaltake cases, my favorite case manufacturer, sadly had to give my Thermaltake Chaser MK-1 to the wife since I wanted a full loop for the 6800XT, and I would have had to do a lot of hard modding to that case to make it fully water friendly, but still love it


I am not impressed with the build quality but the case is functional.


----------



## xenosys (May 5, 2021)

turbogear said:


> Yes the prices are higher than in Germany.
> Mindfactory is selling 6900XTU Liquid Devil Ultimate for 2299€.
> I paid that price for mine.
> 
> ...



Speaking of which, I'm keen to see how your overclocking sessions and performance figures go, as I may get a Liquid Devil Ultimate. Have you received your new PSU yet and run some tests?


----------



## turbogear (May 5, 2021)

xenosys said:


> Speaking of which, I'm keen to see how your overclocking sessions and performance figures go as I may get a Liquid Devil Ultimate.  Have you received your new PSU yet and ran some tests?


I got a new power supply, a be quiet! Dark Power Pro 12 Platinum.

I have not done any further tests or anything special with my computer in the last 10 days, though. 
The last weeks have been difficult because of COVID affecting family members back home in India. 
I don't have the mood and energy to do much with the computer, as my thoughts are somewhere else at the moment. 
Hopefully all will go well over there. 
The situation is not easy, as you might have heard from the media.


----------



## xenosys (May 5, 2021)

turbogear said:


> I got a new supply Be quiet Dark Power Pro 12 Platinum.
> 
> I did not do any further tests though or anything special with my computer since 10 days.
> The last weeks were difficult because of COVID affecting family members back at my home in India.
> ...



Yep, hope you and family are well.  Best wishes.


----------



## turbogear (May 5, 2021)

xenosys said:


> Yep, hope you and family are well.  Best wishes.


Thanks a lot for the well wishes. My wife, children, and I are fine here in Germany, but multiple relatives are in hospital back in India. 

If you want to see how well a 6900XTU Liquid Devil Ultimate can perform, you can visit the link below.

People at this German forum are posting their scores.
The top spot, with a 23338 Time Spy score, belongs to the owner of a 6900XTU Liquid Devil Ultimate. 
That is an amazing score. The owner, L!ME, has replaced his thermal paste with liquid metal.








						Radeon RX 6000: 3DMark Time Spy Rangliste
					

Weitere Ranglisten: 3DMark Port Royal | 3DMark Time Spy Effizienz  Schnellreise: Multi-GPU | 6950 KXTX | 6900 XTXH/Mod | 6900 XT | 6800 XT | 6800 | 6750 XT | 6700 XT | 6650 XT | 6600 XT | 6600 Radeon RX 6000:  Time Spy Rangliste    Besitzer einer Radeon 6600 (XT), 6650 XT, 6700 XT, 6750 XT, 6800...




					www.hardwareluxx.de
				




Time Spy score of L!ME:








						I scored 22 202 in Time Spy
					

AMD Ryzen 9 5950X, AMD Radeon RX 6900 XT x 1, 32768 MB, 64-bit Windows 10}




					www.3dmark.com


----------



## Felix123BU (May 5, 2021)

turbogear said:


> Thanks a lot for well wishes. Me and my wife and children in Germany are fine but multiple relatives are at hospital back in India.
> 
> If you want to look how good 6900XTU Liquid Devil Ultimate can perform you can visit the link below.
> 
> ...


Don't tell me about liquid metal, I am soooo tempted to do that for my 6800XT, still have some Coollaboratory Liquid Pro at hand


----------



## turbogear (May 5, 2021)

Felix123BU said:


> Don't tell me about liquid metal, I am soooo tempted to do that for my 6800XT, still have some Coollaboratory Liquid Pro at hand


It is very tempting indeed.
I also still have some Thermal Grizzly Conductonaut, but after my last card failed and had to be sent back, I will be careful not to mod a card in an irreversible way.
With LM one needs to use some protective coating to isolate the surrounding components from short circuits.
I am not sure how easy that coating is to remove if one needs to send the card back.


----------



## Felix123BU (May 5, 2021)

turbogear said:


> It is very tempting indeed.
> I also still have some Thermal Grizzly Conductonaut,  but after my last card failing and need to send it back I will be careful not to mode it in an irreversible way.
> With LM one need to use some protection paint to isolate the other components from short circuit.
> I am not sure how easy it is to remove the paint if one need to send the card back.


I always used some non-conductive thermal paste as insulation: I basically covered the nearby circuitry in Arctic MX-2, of which I have like 4 tubes. It's easy to remove, does not flow away, and I never had one incident. I've liquid-metaled 3 CPUs and 4 GPUs with this method without a single issue. It's not as safe as something like clear nail polish, but done right it's just as effective at protecting the circuitry.  

And since my GPU block is nickel-plated, liquid metal should be fine. I have to move all components to a new case anyway, so I am reasonably sure my 6800XT will receive the blessing of metal


----------



## Butanding1987 (May 6, 2021)

Since using liquid metal on my Nitro+, I noticed that my core temp has barely moved past 50 degrees Celsius under load. It mostly stays at 44-45. My hot spot temp used to go past 70 when running Fire Strike, but now it only maxes out in the mid-60s. My ambient temp is pretty high at 37 degrees Celsius (very hot here).

Many people seem to frown at the thought of using LM on a GPU given the risks. I think with a little bit of common sense, you should be fine.


----------



## wolf (May 6, 2021)

Butanding1987 said:


> Many people seem to frown at the thought of using LM on a GPU given the risks. I think with a little bit of common sense, you should be fine.


Did you do anything to protect the areas around the die itself, or just being very cautious in the application of the LM?


----------



## GamerGuy (May 6, 2021)

Butanding1987 said:


> Many people seem to frown at the thought of using LM on a GPU given the risks. I think with a little bit of common sense, you should be fine.


The problem for me is that my Nitro+ is a local set with warranty provided by the local distributor (hence I paid more than for the XFX MERC319 RX 6900 XT from Amazon I'd wanted to get). Any tampering with the warranty stickers/seal automatically voids the warranty, which isn't covered by Sapphire but through the local distributor, so if they decide the seals have been tampered with, the warranty is null and void. 

IF I had gotten, say, the MERC319 RX 6900 XT from Amazon as I'd intended, covered by a US warranty, I'd not mind trying what you'd done....


----------



## Butanding1987 (May 6, 2021)

wolf said:


> Did you do anything to protect the areas around the die itself, or just being very cautious in the application of the LM?


I used nail polish around the GPU die (applied 3x) and covered the entire area with Kapton tape. I also applied just enough liquid metal on the die and GPU block so it doesn't flow into the other parts. The Nitro+ was vertically installed.


GamerGuy said:


> Problem for me is, my Nitro+ is a local set with warranty given by the local distributer (hence, I'd paid more than the XFX MERC319 RX 6900 XT from Amazon I'd wanted to get), any tampering with the warranty stickers/seal would automatically void the warranty (which isn't covered by Sapphire, it's covered thru the local distributer, so if they decide that the seals have been tampered with, warranty's null and void).
> 
> IF I had gotten, say, the MERC 319 RX 6900 XT as I'd intended from Amazon, and it's covered with a US warranty, I'd not mind trying what you'd done....


I thought there's a US law that bars distributors/manufacturers from voiding warranty just because the card was opened? About the stickers, I guess you can always claim that they weren't there in the first place.


----------



## GamerGuy (May 6, 2021)

Butanding1987 said:


> I thought there's a US law that bars distributors/manufacturers from voiding warranty just because the card was opened? About the stickers, I guess you can always claim that they weren't there in the first place.


Problem is, I ain't in the US of A; I'm somewhere in the deep dark heart of Asia. That's why I said IF I had gotten the XFX MERC319 RX 6900 XT from Amazon (USA), I'd not mind trying out LM, as I know US sets have more lax warranty coverage.


----------



## Raendor (May 6, 2021)

After a few weeks of ownership I finally tweaked the card to what seems a stable and good enough balance between clocks, temps, and power. I left the card pretty much within the stock 2300-2400 range at 1075 mV, consuming around 200W with jumps to 220W at the highest. I think I'm not gonna bother further. The raw power is good.

What I absolutely hate, though, is how AMD drivers handle vsync/VRR/FPS limiting. Two examples:
1) Fallout 4 gets capped to 48 fps or lower (refresh rate divided by 3) no matter which settings I set in the driver, and I have to remove vsync altogether in the .ini. But then it becomes a problem for the FPS limiter, because FRTC can't be set per game and Chill doesn't work properly for hacking/lockpicking.
2) Hitman 3 caps at 48 too (so the lowest FreeSync border) if I choose 50% vsync (the game runs at higher fps than I need).

On the Nvidia side I could choose per game whether the game dictates vsync, and I could set an FPS limit per game too. I never had issues like these. The software side definitely keeps disappointing me on an otherwise beastly RX 6800 XT.


----------



## Felix123BU (May 6, 2021)

Raendor said:


> After few weeks of ownership I finally tweaked the card to what seems a stable and good enough balance between clocks/temps/power. I left the card pretty much within stock 2300-2400 range at 1075 consuming around 200w with jumps to 220 highest. Think I’m not gonna bother further. Raw power is good.
> 
> what I absolutely hate though is how amd drivers work around vsync/vrr/fps limiting. 2 examples:
> 1) fallout 4 gets capped to 48 fps or lower  (refresh rate divided by 3) no matter which settings I set in driver and I have to remove vsync altogether within .ini. But then it’s a problem of fps limiter, because DRTC can’t be set per game and Chill doesn’t work properly for hacking/lockpicking.
> ...


For the 2300-2400 range I think you could go a bit lower on voltage; 1040 or 1060 should work, but of course it depends a lot on the chip bin. Some can go lower, some need more. I run my 6800XT at 2600-2700 at 1040mV, 100% stable.

Vsync you can set per game, but the FPS limiter is only global, which is a pity; a per-game FPS limiter would have been really useful. In a way vsync is an FPS limiter, in that it locks FPS to your display's max. I use vsync when Radeon Chill is not working properly and I need some frame reduction. I love Radeon Chill, an excellent feature; in some games it only sort of works, though in most it works fine.


----------



## Raendor (May 6, 2021)

Felix123BU said:


> For 2300-2400 range I think you could go a bit lower on voltage, 1040 or 1060 should work, but of course it depends a lot on the chip bin. Some can go lower, some need higher. I run my 6800XT at 2600-2700 at 1040mv, 100% stable.
> 
> The Vsync setting you can set per game, but the FPS limiter is only global, which is a pity, it would have been really useful to have the FPS limiter per game. In a way Vsync is a FPS limiter as in it will lock FPS to your display max, I use Vsync when Radeon Chill is not working properly and need some frame reduction. I love Radeon Chill, excellent function, only that in some games its sort of working, though in most it works fine.


Nope, crashes in RDR2, which turned out to be quite a good benchmark as it either crashes immediately or within 15 mins of gameplay depending on how "bad" settings are. The highest real voltage under these settings is 1031 according to hwinfo.


----------



## Felix123BU (May 6, 2021)

Raendor said:


> Nope, crashes in RDR2, which turned out to be quite a good benchmark as it either crashes immediately or within 15 mins of gameplay depending on how "bad" settings are. The highest real voltage under these settings is 1031 according to hwinfo.


1040mv or 1060mv? Out of curiosity, when you say 2300-2400 range, are you saying you use those exact values as upper and lower limit?


----------



## Raendor (May 6, 2021)

Felix123BU said:


> 1040mv or 1060mv? Out of curiosity, when you say 2300-2400 range, are you saying you use those exact values as upper and lower limit?


Either. 1060 crashes within 5 mins and 1050 crashes almost immediately. I can run 2275-2375 at 1070, but that's it, so I left it at 2300-2400 and 1075 for some margin. With these I can play for hours, no trouble.

These are the exact numbers I set in Wattman for low clock/high clock and voltage.


----------



## Felix123BU (May 7, 2021)

Could not resist, and since I had a new case coming in and some new radiator fans, I put Liquid Metal (Coollaboratory Liquid Pro) between the GPU block (Alphacool Eisblock Aurora) and my 6800XT.....
The results blew me off my chair (literally, since one of the wheels came off and I almost broke my neck  ). After 30 minutes of extreme burn tests in Furmark, with settings that constantly pulled around 330w, the GPU temp had a max of 45c and the hotspot a max of 60c (mostly 59), coupled with a CPU burn-in to get the loop water as warm as possible. That is a 15c reduction in hotspot temps and a 10c reduction in GPU temp. It would probably have been a couple of degrees lower with only the 6800XT frying and the CPU sitting idle.

Same decrease in Cyberpunk, where the GPU was at around 52 before and is now at 41, and the hotspot, which used to be around 72ish, is now 56c..... with the game drawing around 300w on average. I did LM on other cards before, but this one shows by far the most impressive results


----------



## Butanding1987 (May 8, 2021)

Don’t tempt turbogear too much. He’s desperate to improve his 94th global ranking in Fire Strike Ultra.


----------



## GerKNG (May 8, 2021)

anybody else very disappointed with the 6900XT/6800XT Nitro+?
The cooler is absolute garbage and barely on par with the reference card,
and the heatsink is basically made for a 5700XT. i wish i could swap my card out for a different one (compared to the Nitro+, my MERC 319 Black is like a Strix next to an ASUS dual-fan)


----------



## turbogear (May 8, 2021)

Butanding1987 said:


> Don’t tempt turbogear too much. He’s desperate to improve his 94th global ranking in Fire Strike Ultra.


No no not anymore. 

It is tempting, as the 6900XTU has a lot of potential, but I set a target of maximum 384W power and 384A TDC and will tune whatever performance I can squeeze out of the card within that target. 

If you give this monster more power it can go higher in performance, but it's not worth it as power consumption gets too high.
Let's see where I end up in the global chart after tuning. 

At the moment it is set to 2750MHz@1175mV at 384W and 320A TDC. I could not find any stable undervolt lower than 1175mV at this frequency. It can be set to 2850MHz at higher voltage, but that brings no benefit at 384W, as it needs more power to give more performance.


----------



## BarbaricSoul (May 8, 2021)

GerKNG said:


> anybody else very disappointed with the 6900XT/6800XT Nitro +?
> The cooler is absolute garbage and barely on par with the reference card.
> and the heatsink is basically made for a 5700XT. i wish i could swap my card out for a different one (my 319 merc black is a strix vs. asus dual fan compared to the nitro +)



don't know about the other Nitro+ cards, but I'm really happy with my Nitro+ 6700XT's cooler performance. Up to 2.9 GHz and it stays under 70°C


----------



## Felix123BU (May 8, 2021)

turbogear said:


> No no not anymore.
> 
> It is tempting as the 6900XTU has lot of potential but I put a target for maximum power of 384W and 384A TDC and will tune whatever performance I can squeeze out of the card in that target.
> 
> ...


And your first 6800XT could do 2700mhz at 998mv .....  At least you have a very good chip in this 6900XT


----------



## turbogear (May 8, 2021)

Felix123BU said:


> And your first 6800XT could do 2700mhz at 998mv .....  At least you have a very good chip in this 6900XT


When I remember that 6800XT card I could start crying. 
I never got anything as good as that. Pity that it died.

But the positive side is that at a 2700MHz real game frequency this 6900XTU is much faster. 
The other two 6900XTs that I had did not even run at 2620MHz with 384W and 384A TDC.

6900XTs are not as energy efficient as a good 6800XT, but you know my second 6800XT Red Devil was really disappointing, with only 2400MHz, and it did not undervolt at all even at that frequency.


----------



## Raendor (May 9, 2021)

Noticed that after the last driver my power consumption decreased, fan speed too a bit, and gpu utilization is now within the 94-99 range, while it used to be 97-99 before. Also, I rolled back to 21.4.1 after a couple of DDU runs and the behavior persists, even though all the settings are the same as I used to have. Find it weird. 21.5.1 for whatever reason seems to have lower performance in hitman 3 (reproducible in benchmarks). Guess I’ll be waiting for the next version instead.


----------



## Felix123BU (May 9, 2021)

Raendor said:


> Noticed that after last driver my power consumption decreased, fan speed too a bit and gpu utilization is now within 94-99 range, while it used to be in 97-99 before. Also, rolled back to 21.4.1 after couple DDUs and the behavior persists. Despite all the settings are same I used to have. Find it weird. 21.5.1 for whatever reason seems to have lower performance in hitman 3 (reproducible in benchmarks). Guess I’ll be waiting for next version instead.


21.3.1 is still the best driver performance-wise; for me the difference between 21.3.1 and 21.4.1 is minimal, but I still think 21.3.1 performs a tiny bit better. I am very very curious whether the next big one brings FSR, as it's rumored to come out in June


----------



## xenosys (May 9, 2021)

Raendor said:


> Noticed that after last driver my power consumption decreased, fan speed too a bit and gpu utilization is now within 94-99 range, while it used to be in 97-99 before. Also, rolled back to 21.4.1 after couple DDUs and the behavior persists. Despite all the settings are same I used to have. Find it weird. 21.5.1 for whatever reason seems to have lower performance in hitman 3 (reproducible in benchmarks). Guess I’ll be waiting for next version instead.



I've been experiencing issues with performance in benchmarks and games since the early Feb driver release.  I only updated recently as the latest driver supposedly gives Resident Evil 8 a nice performance boost on the 6800XTs, but once completed I may switch back to that driver release, as it seemed to provide slightly higher clocks and better overall performance.


----------



## dragospetre88 (May 9, 2021)

Felix123BU said:


> Cant say I am an expert on that model, but what you are describing sound like either your case is badly ventilated, which would make hot air get recirculated and that would drastically regress the cooling performance (from what I read above, that's not the case), or  some issue with the mounting of the cooler. That cards cooler should be good enough to allow it to OC easily without turning it into an oven, so my guess is the cooler is not ideally mounted, hence at higher wattage it craps out basically. With the risk of voiding warranty, you could try to screw the 4 main screws tighter, or add some washers, but if your card has warranty seals, those would be broken during that process, so up to you how you want to handle it. In my opinion, 100-200 extra mhz is not that big of performance jump to merit all the hassle


I had someone reapply the thermal paste and I used the expensive Thermal Grizzly one. Now the junction is up to 15-20 degrees Celsius lower. The guy who helped me told me the screws on the cooler weren't tight enough, and that there was too much low-quality thermal paste ALL AROUND. I tried a Time Spy run and scored 20700 with MPT up to 375W, 2555-2655, 2138 memory... Now clocks sit comfortably in the mid 2650s with a max 90 degrees junction. AMAZING!


----------



## Felix123BU (May 9, 2021)

xenosys said:


> I've been experiencing issues with performance in benchmarks and games since the early Feb driver release.  I only updated recently as the latest driver supposedly gives Resident Evil 8 a nice performance boost on the 6800XT's, but once completed I may switch back to that driver release as it seemed to be provide slightly higher clocks and better overall performance.


I noticed that with the last 2 drivers the GPU is drawing less voltage; up until 21.3.1 the driver used to apply a bit more voltage, and with the ones after 21.3.1 it tries to apply less. The difference between 21.3.1 and 21.3.4 is minor in performance, with 21.3.1 maybe having 1% better perf in some rare situations. If I were to make a joke: with 21.3.1 they increased performance too much and then gradually toned it down, too much perf at once won't leave anything for future drivers


----------



## Raendor (May 10, 2021)

xenosys said:


> I've been experiencing issues with performance in benchmarks and games since the early Feb driver release.  I only updated recently as the latest driver supposedly gives Resident Evil 8 a nice performance boost on the 6800XT's, but once completed I may switch back to that driver release as it seemed to be provide slightly higher clocks and better overall performance.


Ha, and I’ve been bombarded on Reddit with messages about how great amd drivers have been since vega days and how everything is perfect in all games, despite known issues with CryEngine and other stuff like the conflict with Afterburner, Vulkan fps drops, and Radeon Software crapping out while browsing tabs in it.


----------



## GamerGuy (May 10, 2021)

So far, I've used ATi/AMD cards since the 9700 Pro, running both CF and SLi systems along the way. As with nVidia drivers, I can't say anything negative about their driver, other than some not performing well in newer or just-released games (though my recent experience with the AMD driver in newer games has been just fine). The only issue I had was when I'd just received my RX 6900 XT: Metro Exodus (not the PC EE version) would CTD or crash hard with RT enabled; without RT, it ran fine. That was fixed with a later driver release, and thus far my card has had no issue with games.....no weird clockspeeds, no memory stuck at full speed while on the desktop, etc.


----------



## dragospetre88 (May 10, 2021)

My best TimeSpy on Red Devil 6900 XT Limited Edition - screenshot attached. MPT: 325-350 W. Clocks - 2565-2665, memory 2138 mhz.


----------



## Felix123BU (May 10, 2021)

Raendor said:


> Ha, and I’ve been bombarded on Reddit with messages about how great amd rivers are since vega days and everything is perfect in all games despite known issues with CryEngine and other stuff like conflict with Afterburner, Vulkan fps drops and radeon software crapping out while browsing tabs in it.


No driver is perfect; whenever you hear someone say AMD has a perfect driver, or Nvidia has a perfect driver, you should know you are talking to a FANBOY  
What I can say is that the latest AMD drivers are better than they were a year ago; fewer things causing issues, very very few in fact for me.


----------



## GerKNG (May 10, 2021)

had to repaste my 6900xt Nitro +
hot spot dropped by up to 35°C.
now the card boosts to around 2680 Mhz in games.


----------



## Felix123BU (May 10, 2021)

GerKNG said:


> had to repaste my 6900xt Nitro +
> hot spot dropped by up to 35°C.
> now the card boosts to around 2680 Mhz in games.


Sweet, so the cooler is not absolute garbage, only the stock paste application


----------



## GerKNG (May 10, 2021)

Felix123BU said:


> Sweet , so the cooler is not absolute garbage, only the stock paste application


still not great compared to others but the paste is the worst i've ever seen.


----------



## moproblems99 (May 10, 2021)

Well, aside from the fact that I am unimpressed with EK products (packaging mostly, or lack thereof), I am thrilled with the performance of the block.  My junction was hitting 110C within like 10 minutes of playing RDR2.  Playing for a quick 15 minutes after the block?  Didn't see the junction over 60C.  Damn son.

To add, I don't even have the fourth fan on the 560mm.  Even better, I could still add 1120mm of radiator if I wanted to.  Although I think 560mm will be enough forever.


----------



## Butanding1987 (May 10, 2021)

dragospetre88 said:


> My best TimeSpy on Red Devil 6900 XT Limited Edition - screenshot attached. MPT: 325-350 W. Clocks - 2565-2665, memory 2138 mhz.


Your graphics score seems to be on the low side for an overclocked 6900 XT. I think you can push the core clocks higher, maybe around 2,650 to 2,750. Are you on air?


----------



## Felix123BU (May 10, 2021)

moproblems99 said:


> Well, aside from the fact that I am unimpressed with EK products (packaging mostly, or lack of), I am thrilled with the performance of the block.  My junction was hitting 110C playing RDR2 in like 10 minutes.  Playing for a quick 15 minutes after the block?  Didn't see junction over 60C.  Damn son.
> 
> To add, I don't even have the fourth fan on the 560mm.  Even better, I can still add 1120mm of radiator if I wanted too.  Although I think 560mm will be enough forever.


Yup, a lot of people disregard custom water cooling, but if you want the lowest temps and reasonably silent operation, there is no air cooling that can beat even a decent custom loop, let alone the more sophisticated ones put together by people who know how to build an efficient loop  

I did not think my 6800XT could be cooled so well by a custom loop until I put the block on and put LM on the chip; for a GPU built on 7nm and running way out of its spec with a lot of extra power going through it, to basically never reach 60c on the hotspot is a small miracle


----------



## moproblems99 (May 10, 2021)

Felix123BU said:


> Yup, a lot of people disregard custom water cooling, but if you want lowest temps and reasonably silent operation, there is no air cooling that can beat a even decent custom loop, let alone the more sophisticated ones that are put together by people who know how to build an efficient loop
> 
> I did not think my 6800XT could be cooled so well by custom loop, until I put the block on and put LM on the chip, for a GPU built with 7nm and running way out of its spec with a lot of extra power going through it, to basically never reach 60c on the hotspot is a small miracle


I have an XSPC Raystorm Pro on my 3900XT, an EK Quantum Plexi/Copper, an XSPC AX560MM, and an XSPC Res/Pump D5.  Since EK fudged me on the screws for the backplate and it is already mounted, I may leave the plate off.  I have a fan blasting right up the back of the board, so I am not sure the plate is even advantageous.  My card is vertical, so I am not worried about leakage or flex.


----------



## Butanding1987 (May 10, 2021)

Felix123BU said:


> Yup, a lot of people disregard custom water cooling, but if you want lowest temps and reasonably silent operation, there is no air cooling that can beat a even decent custom loop, let alone the more sophisticated ones that are put together by people who know how to build an efficient loop
> 
> I did not think my 6800XT could be cooled so well by custom loop, until I put the block on and put LM on the chip, for a GPU built with 7nm and running way out of its spec with a lot of extra power going through it, to basically never reach 60c on the hotspot is a small miracle


I was also pleasantly surprised after putting mine under water. My hotspot temp used to hit 114 degrees Celsius when running Port Royal with the stock air cooler. Now it mostly stays in the 60s.


----------



## dragospetre88 (May 10, 2021)

Butanding1987 said:


> Your graphics score seems to be on the low side for an overclocked 6900 XT. I think you can push the core clocks higher, maybe around 2,650 to 2,750. Are you on air?


I am on air, yes.  I cannot go any higher for Time Spy, else it crashes. What do you mean it's on the low side? I haven't seen many RX 6900 XTs posting 21k scores in Time Spy on air...

In games I can push to 2625-2725, which results in constant clock speeds of 2.66-2.67 GHz. But that's with 325-350W in MPT, which makes my card draw up to 375W. I am afraid to go any higher on voltage, although I have a solid Super Flower 1000W PSU...

But anyway, for daily use I undervolt to 1050 mV, 2420-2520, 2138 memory. The junction temp doesn't even go past 80 degrees, while the edge temp is in the low 60s with the fan curve at 45%.


----------



## Raendor (May 10, 2021)

Speaking of RDR2, do you guys get weird fps slowdowns with low gpu usage in towns, in specific places, on vulkan? I don’t have them with DX12 at all, but DX12 crashes if Afterburner is running, unlike vulkan.


----------



## Felix123BU (May 10, 2021)

Raendor said:


> Speaking of RDR2, do you guys have weird fps slowdowns with low gou usage in towns in specific places on vulkan? I don’t have them with DX12 at all, but DX12 crashes if afterburner is running, unlike vulkan.


I tried to play the game; 20 hours in, it bored me to tears. And with the 6000 series DX12 should be used: Vulkan was giving me strong frame fluctuation, while DX12 performed better overall and was smooth as butter with very constant framerates.


----------



## moproblems99 (May 10, 2021)

Felix123BU said:


> I tried to play the game, 20 hours in, bored me to tears, and with the 6000 series DX12 should be used, Vulkan was giving me strong frame fluctuation, DX12 was better performing overall and smooth as butter with very constant framerates.


Vulkan was better on my Vega, DX12 is WAAAAAAAAYYY better on my 6900XT.


----------



## GerKNG (May 10, 2021)

moproblems99 said:


> Vulkan was better on my Vega, DX12 is WAAAAAAAAYYY better on my 6900XT.


RDNA2 seems to not like vulkan at all.

even in Rainbow Six Siege i get less FPS (my 3090 gets like 150 FPS more compared to DX)
and it stutters.


----------



## moproblems99 (May 10, 2021)

I have noticed that my 3900X used to boost to about 4679mhz across multiple cores, but now the highest I have seen is about 4550mhz.  Not that I'm particularly worried about it, but the cpu temp didn't get past 67C.


----------



## Raendor (May 10, 2021)

Ok, you’re putting me at ease a bit, as vulkan was good on my previous 1080, but on the 6800xt DX12 works normally while vulkan drops usage heavily. Although vulkan was always better around loading game zones; DX12 always stutters while you move into the next map section.


----------



## Felix123BU (May 10, 2021)

GerKNG said:


> RDNA2 seems to not like vulkan at all.
> 
> even in Rainbow Six Siege i get less FPS (my 3090 gets like 150 FPS more compared to DX)
> and it stutters.


Vulkan needs a lot more code optimization to reach peak performance, and since RDNA 2 is new and architecturally different from, let's say, Vega (which loved Vulkan), and those games are rather "old", I don't think the developers put much time into updating for RDNA 2. With both Vulkan and DX12 being low-level APIs, there's only so much you can do with drivers   

I am a bit surprised though that RDNA 2 works so well with basically any DX12 game, but not so well with Vulkan.


----------



## Raendor (May 10, 2021)

Felix123BU said:


> Vulkan needs a lot more code optimization to reach peak performance, and since RDNA 2 is new and different architecturally than lets say Vega (who loved Vulkan), and those games being rater "old", don't think the developers put too much time into updating for RDNA 2, both Vulkan and DX12 being a low level API's there's so much you can do with drivers
> 
> I am a bit surprised though why RDNA 2 works so well with basically any DX12 game, but not so good with Vulkan.


I hope it’s still more on the driver side. Technically the API shouldn’t affect a specific graphics series, beyond the way the game was developed.


----------



## Felix123BU (May 10, 2021)

Raendor said:


> I hope it’s still more on the driver side. Technically api shouldn’t affect graphics series due to the way game was developed.


APIs can affect a specific series, it has happened before; see the Maxwell series on Nvidia's side, they were and still are "awful" at DX12 and Vulkan comparatively, and on AMD's side DX11 was worse, always below the competition no matter how much driver work was done. That does not mean that the 6000 series does not support Vulkan properly, it's just that with Vulkan and DX12 the biggest gains come from the developer and the code.

This happened when Vulkan and DX12 came out: the GCN architecture was much better suited architecturally to these APIs than Maxwell or Pascal, but Nvidia caught up in hardware functions and support with later gens, and with both being equally potent theoretically, it's the developers who have the biggest impact on performance for a specific architecture, since even though both vendors support the APIs, the architectures work slightly differently.

I am not worried by default, but if I were the worrying type, I would be cool as long as the game supports both Vulkan and DX12. And anyway, newer games will have RDNA 2 at hand, so in the future these issues should be less prevalent. I also hoped Vulkan would get more traction, but as it looks right now, it's a very niche thing.


----------



## GamerGuy (May 11, 2021)

Good to know. I usually run games on Vulkan (when possible), as I've gotten quite used to thinking it's better based on my experience with my Vega64 rig.  Now I'll always default to DX12 for my RX 6900 XT in every game which has both..at least until RDNA2 cards reach the level of performance the more mature GCN cards had when running Vulkan games.

Gonna fire up Metro Exodus PC EE now, near the end game as I'm now playing the Dead City level....


----------



## dragospetre88 (May 11, 2021)

Guys, I saw that many of you have the Power Limit GPU (W) setting at a lower value than the TDC Limit GFX (A) in MorePowerTool. Why is that? I thought the TDC limit has to be higher... What am I missing here?


----------



## xenosys (May 11, 2021)

dragospetre88 said:


> I am on air, yes.  I cannot go any higher for timespy, else it crashes. What do you mean it's on the low side? I haven't seen many rx 6900 xt posting 21k scores in Timespy on air...
> 
> In games I can push to 2625-2725, which results in 2,66-2.67 ghz constant clock speeds. But that's with 325-350W in mpt, which makes my card suck up to 375W. I am afraid to go any higher on voltage, although I have a solid super flower 1000W psu...
> 
> But anyway for daily use I undervolt to 1050 mv, 2420-2520, 2138 memory. Junction temp doesn't even go past 80 degrees, while edge temp is low 60s with fan curve to 45%.



You're correct.  21K is an excellent score, so you've got a decent card on your hands.  It's well above average on 3DMark, but probably about average on this particular forum, which is generally full of overclockers.  19k is about average for a 6900XT, according to 3DMark.

If you get that under water, you can probably push 21.5k if you're ever so inclined, with load temps being a lot lower.


----------



## turbogear (May 11, 2021)

xenosys said:


> You're correct.  21K is an excellent score, so you've got a decent card on your hands.  It's well above average on 3D Mark, but probably about average on this particular forum, which is generally full of overclockers.  19k is about average for a 6900XT, according to 3D Mark.
> 
> If you get that under water, you can probably push 21.5k if at any point you're inclined with load temps being a lot lower.


Hardwareluxx has collected statistics that show the performance distribution of the different RDNA2 cards:

[User-Review] - Statistik-Analyse: So schlägt sich Navi 21 in 3DMark Time Spy
(Statistical analysis: how Navi 21 fares in 3DMark Time Spy. Three enthusiasts invested time and brainpower to examine publicly available Time Spy graphics benchmark data for the Radeon RX 6800, 6800 XT and 6900 XT.)
www.hardwareluxx.de


----------



## Felix123BU (May 11, 2021)

turbogear said:


> The Hardwareluxx has collected statistics that show performance distribution of different RDNA2 cards:
> 
> 
> 
> ...


According to those highly interesting charts in that forum, my 6800XT is godly; mine can reach a 2670 Mhz average real frequency in gaming   that is off the charts of that chart 

This was only my second reference card after the RX 480; all the others were custom, and not one was as good a chip as this 6800XT  20600 in Time Spy is what it can do with a 100% stable OC.

Interestingly enough, driver 21.3.1 is still the highest performer for the 6800XT and 6900XT in those readings too.
Also interesting is that according to that chart, the batch of reference cards is among the best performers for the 6800XT and 6900XT (though that might be due to those being more often water-blocked).



dragospetre88 said:


> Guys, I saw that many of you have the power limit GPU (W) setting at a lower value than tdc limit GFX (A) in more power tool. Why is that? I thought TDC limit has to be higher... What am I missing here?


I think I am missing what you want to ask: first you ask why folks here have the GPU power limit (W) set lower than the GFX TDC (A), then you say TDC has to be higher. So you are asking why TDC is higher here, when you yourself state it should be higher?


----------



## dragospetre88 (May 11, 2021)


Felix123BU said:


> According to that forum, highly interesting charts, my 6800XT is godly, mine can reach 2670 Mhz average real frequency in gaming   that is off the charts of that chart
> 
> This was only my second reference card after the RX 480, all other where custom and not one was such a good chip as this 6800XT  20600 in Timespy with a 6800XT is what it can do with a 100% stable OC.
> 
> ...


Yeah, my mistake... Basically I wanted to know why people have the power limit higher than the TDC limit. I always have TDC higher when using MPT.


----------



## turbogear (May 11, 2021)

Felix123BU said:


> According to that forum, highly interesting charts, my 6800XT is godly, mine can reach 2670 Mhz average real frequency in gaming   that is off the charts of that chart
> 
> This was only my second reference card after the RX 480, all other where custom and not one was such a good chip as this 6800XT  20600 in Timespy with a 6800XT is what it can do with a 100% stable OC.
> 
> ...


Yes, my 6800XT (R.I.P. edition)  was also godly.
Mine was running all the time at 2700MHz in games.

Maybe our theory that the first batches of reference cards were all gold samples is true.


----------



## Felix123BU (May 11, 2021)

dragospetre88 said:


> Yeah, my mistake... Basically I wanted to know why People have power limit higher than tdc limit. I always have tdc higher when using mpt.


Basically to have more wattage available for OCs, though that can be dangerous for the chip and the power circuitry and is not recommended (except if maybe you only increase Power Limit GPU and not TDC). Generally, if you increase limits you should do it linearly; that would always leave your max TDC higher than your max GPU power limit on Navi 2.
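A minimal sketch of that "increase both limits linearly" idea, using the reference 6800XT defaults mentioned elsewhere in the thread (255W / 300A) as the starting point; the +20% factor is just an example, not a recommendation:

```python
# Scale Power Limit GPU and TDC Limit GFX by the same factor, so the
# numeric TDC value (A) stays above the power limit (W), as it does
# on the Navi 21 reference BIOS. Stock values are the reference
# 6800XT defaults; the 1.2 multiplier is only an illustration.

STOCK_POWER_W = 255  # reference Power Limit GPU
STOCK_TDC_A = 300    # reference TDC Limit GFX

def scaled_limits(factor):
    """Raise both limits by the same multiplier."""
    return STOCK_POWER_W * factor, STOCK_TDC_A * factor

power, tdc = scaled_limits(1.2)  # example +20% bump
print(power, tdc)  # -> 306.0 360.0, TDC still above the power limit
```

Scaling both together preserves the stock ratio, which is the point Felix is making; bumping only one of the two is what breaks the usual TDC-above-power-limit pattern.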



turbogear said:


> Yes mine 6800XT (R.I.P. edition)  was also Godly.
> Mine was running all time at 2700MHz in games.
> 
> Maybe our theory of first batches of reference cards being all Gold samples is true.


"Maybe our theory of first batches of reference cards being all Gold samples is true . " --- could very well be true, first batch of GPUs where all great OC-er, have not heard about crappy first batch 6000 series. It would be a new marketing style for the new Lisa Su AMD


----------



## HaggtGT (May 11, 2021)

Felix123BU said:


> Basically to have more wattage available for OC's. though that can be dangerous for the chip and the power circuitry and not recommended (except if maybe you only increase Power Limit GPU and not TDC). Generally if you increase limits you should do it linearly, that would always leave your max TDC  higher than max GPU power limit for Navi 2.
> 
> 
> "Maybe our theory of first batches of reference cards being all Gold samples is true . " --- could very well be true, first batch of GPUs where all great OC-er, have not heard about crappy first batch 6000 series. It would be a new marketing style for the new Lisa Su AMD


My 6800 XT Red Devil stock bios has a really low default TDC. You said TDC is normally left higher than the power limit, but this card's is way below it, never mind way above it. Do you know why?


----------



## xenosys (May 11, 2021)

turbogear said:


> Yes mine 6800XT (R.I.P. edition)  was also Godly.
> Mine was running all time at 2700MHz in games.
> 
> Maybe our theory of first batches of reference cards being all Gold samples is true.



Well I also have a Powercolor Reference 6800XT and was hitting 20950 in Timespy a couple of months ago, so you may be right.


----------



## Felix123BU (May 11, 2021)

HaggtGT said:


> My 6800 XT Red Devil stock bios has a really low default TDC? You said to leave it higher than your limit but this card is way below it never mind way above it. Do you know why?


Not sure (maybe the power delivery cooling is not the best and they toned it down). Here are my default stock MPT readings





Looking at this: GPU PCB Breakdown: RX 6800XT Red Devil - YouTube
I can say your TDC is very low. My reference board has a smidge weaker PCB design as far as power delivery goes, and since your card can theoretically handle much more, if I were you I would set a 300A (if it does not blow up, up to 330) TDC Limit GFX and a 300W Power Limit GPU. That should in theory be safe enough, provided of course the card does not get out of hand with temperatures. But again, I am not telling you to do this, it's only what I would do if I had that card, and if you do it, I take no responsibility for what might happen


----------



## turbogear (May 11, 2021)

HaggtGT said:


> My 6800 XT Red Devil stock bios has a really low default TDC? You said to leave it higher than your limit but this card is way below it never mind way above it. Do you know why?


Yes, on the Red Devil it is the opposite of the reference card.
If I remember correctly, on the reference it was 255 W and 300 A TDC by default.

I also did not understand why the Devil does the opposite of the reference design.
Maybe it depends on chip quality?

Basically, if I understand correctly, one of the above-mentioned values is the core power target and the other is the maximum core current target.
Power is related to voltage and current:
Power = Voltage x Current

So if we have the same power target on the reference and the Devil, then on the reference a lower voltage and higher current produce that power target.
On the Devil it would be the opposite: a higher voltage and lower current give the power target.

Why is it set like this on some non-reference cards?
That is a question I don't know the answer to.
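To put that relation in concrete terms, here is a minimal Python sketch (the 255 W / 300 A figures are the reference defaults mentioned above; the voltages are just assumed examples):

```python
# Power = Voltage x Current, so for a fixed power target the
# voltage and current limits trade off against each other.

def current_at(power_w: float, voltage_v: float) -> float:
    """Current (A) drawn at a given power (W) and core voltage (V)."""
    return power_w / voltage_v

# Reference 6800 XT defaults mentioned above: 255 W power limit, 300 A TDC.
power_limit_w = 255.0
tdc_limit_a = 300.0

# At a lower core voltage (undervolted), the current needed to reach the
# power limit rises, so the TDC limit can become the binding constraint.
for voltage in (1.15, 1.00, 0.80):  # assumed example voltages
    amps = current_at(power_limit_w, voltage)
    binding = "TDC" if amps > tdc_limit_a else "power limit"
    print(f"{voltage:.2f} V -> {amps:.0f} A needed, {binding} binds first")
```

So which limit you hit first depends on where your undervolt lands, which may be why different boards ship with different splits.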


----------



## Felix123BU (May 11, 2021)

xenosys said:


> Well I also have a Powercolor Reference 6800XT and was hitting 20950 in Timespy a couple of months ago, so you may be right.


Yup, mine is also a Powercolor Reference 6800XT


----------



## dragospetre88 (May 11, 2021)

turbogear said:


> Yes on Red Devil it is opposite to reference card.
> If I think on reference it was 255W and 300A TDC by default.
> 
> I did not also understand why the Devil is doing opposite to reference design.
> ...


My default VBIOS for the Red Devil RX 6900 XT has a 281 W power limit and a 320 A TDC Limit GFX. When I want more power, I tend to go for a 325 W power limit and a 345-350 A TDC Limit GFX. Is this OK? From what I understand from you, it is not...


----------



## turbogear (May 11, 2021)

dragospetre88 said:


> My default vbios for red devil rx 6900 xt has 281 W power limit  and 320 tdc limit gfx. When i use more power I tend to go for 325 W power limit and 345-350 tdc limit gfx. Is this ok? From what I understand from you it is not...


I did not say it is not okay.
It was only my observation that the reference and Red Devil 6800XT are configured differently.
I am trying to understand the reason why some cards are configured this way.

Below are the default settings on my 6900XTU:
The total power limit is 382 W for me with the +15% from Radeon Software,
i.e. 332 x 1.15 = 382 W.

The TDC set by PowerColor is 320 A. Again, in this case the TDC figure is 62 below the max power limit figure.

I currently have the TDC at 340 A in MPT. I will try increasing it to 360 A.
That would still be below the 382 W power limit.
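For anyone wondering where the 382 W comes from, a quick sketch of the arithmetic (assuming, as above, that the Wattman slider stacks multiplicatively on the MPT value):

```python
# The Wattman power-limit slider applies a percentage on top of the
# base Power Limit GPU value set in MPT, so the two stack multiplicatively.

def effective_power_limit(mpt_limit_w: float, slider_pct: float) -> float:
    """Total board power ceiling after applying the Wattman slider."""
    return mpt_limit_w * (1 + slider_pct / 100)

# Values from the post above: 332 W in MPT, +15% in Radeon Software.
print(round(effective_power_limit(332, 15)))  # -> 382, the figure quoted above

# Raising MPT by 15% over stock and then adding the +15% slider
# compounds to roughly +32% overall (1.15 * 1.15 ~ 1.32).
stock_w = 332
print(effective_power_limit(stock_w * 1.15, 15) / stock_w)  # -> ~1.32
```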


----------



## Felix123BU (May 11, 2021)

dragospetre88 said:


> My default vbios for red devil rx 6900 xt has 281 W power limit  and 320 tdc limit gfx. When i use more power I tend to go for 325 W power limit and 345-350 tdc limit gfx. Is this ok? From what I understand from you it is not...


As long as you do not get stability issues or too-high temps, it's fine, so to say. As turbogear said, it seems some cards are configured a bit differently (Red Devils in particular), for reasons unknown to us  
But as long as you get the extra performance and stay no more than +15% in MPT vs stock settings (+10% in MPT I would consider conservative), which together with the +15% Wattman slider gives you roughly 30% extra power overall (1.15 x 1.15 is about 1.32), and as long as you don't notice instability or abnormally high temps on the CPU, RAM, VRM etc., you are golden


----------



## dragospetre88 (May 12, 2021)

Felix123BU said:


> As long as you do not get stability issues and to high temps, its perfect so to say. As turbogear said, seems some cards are configured a bit differently (red devils in particular), for reasons unknown to us
> But as long as you get the extra performance, and are not more than +15% in MPT vs stock settings (10% plus in MPT I would consider conservative), that would give you an additional 30% extra power overall, and as long as you don't notice instability or abnormally high temps on any of the CPU, RAM, VRM etc, you are golden


Junction temp does not go above 95 °C and performance is up roughly 8-10% compared to stock settings. Pretty good, I would say.


----------



## Felix123BU (May 12, 2021)

dragospetre88 said:


> Junction temp does not GO above 95 degrees C and the performance is up roughly by 8-10% compare to stock settings. Pretty good I would say.


Yup, a 10% overclock is not bad at all, quite good actually    Very good!


----------



## dragospetre88 (May 12, 2021)

Felix123BU said:


> Yup, a 10% overclock is not bad at all, quite good actually    Foarte bine


Just did some fresh testing at 4K, default preset vs. OC (SAM enabled for both). Horizon Zero Dawn hit 84 fps on average with my OC, up from 78. Valhalla reached 80 fps, up from 72, and Metro Exodus Enhanced reached 78, up from 71 (at 1440p only, high ray tracing, ultra preset; the game lags too much at 4K). 1% lows and 0.1% lows were significantly higher too.


----------



## moproblems99 (May 12, 2021)

Raendor said:


> Ok, you’re putting me to a rest a bit as vulkan was good on my previous 1080, but on 6800xt DX12 works normal, while vilkan drops usage heavily. Although vulkan was always better around loading game zones. Dx12 always stutters while you move into next map section.


So, after having switched back to DX12, I started getting crashes again about every 15 minutes of gameplay.  This was the original reason I switched to Vulkan in the first place.  The good news is that after terrible initial performance with Vulkan, I have switched back and it appears to be running flawlessly again, just like with my Vega.  Framerates are a little better too.


----------



## Raendor (May 12, 2021)

moproblems99 said:


> So, after having switched back to DX12, I started getting crashes again about every 15 minutes of gameplay.  This was the original reason I switched to Vulkan in the first place.  The good news, is that after terrible initial performance with Vulkan, I have switched back and it appears to be running flawlessly again just like with my Vega.  Framerates are a little better too.


Doesn't work for me. Every time I switch to Vulkan I get fps drops along with drops in GPU usage (down to 50% in specific areas) which are not present in DX12 (but I can't use AB with it for monitoring). Btw, if you switch, delete the sga and temp files in the Settings folder.


----------



## moproblems99 (May 12, 2021)

Raendor said:


> DOesn't work for me. Every time I switch to Vulkan I have fps drops with drops in gpu usage (down to 50% in specific area) which are not present in DX12 (but I can't use AB with it for monitoring). Btw if you switch - delete sga and temp files in Settings folder.


Interesting.  I did not delete those.  After switching, my fps went from mid-70s on DX12 to low 80s on Vulkan.  It's a little more than margin of error, though I won't say for certain it was from the switch.  I have had notorious problems with RDR2.  I couldn't play online for 4 months after it was released.  When I could get online, I would crash instantly or my camp wouldn't spawn.



Raendor said:


> DOesn't work for me. Every time I switch to Vulkan I have fps drops with drops in gpu usage (down to 50% in specific area) which are not present in DX12 (but I can't use AB with it for monitoring). Btw if you switch - delete sga and temp files in Settings folder.


Also, if you are only using Afterburner for monitoring, I switched to HWiNFO.  It uses RTSS just like Afterburner, but you can add way more metrics.


----------



## Raendor (May 12, 2021)

Well, I'll be damned. I just tried disabling Resizable BAR, and RDR2 on Vulkan didn't experience the drops I had before. WTF, I thought this was all fixed by now. I wonder if my Asus B560-I is to blame (latest BIOS is installed) or the AMD driver doesn't work properly with Rocket Lake CPUs. But at the same time, DX12 was totally fine.

I'll be damned twice. I turned it back on and had no drops on Vulkan when I tried again. BIOS bug? Who knows. I now wonder if I can run Afterburner without issues and whether my undervolt could be improved further. Who knows how that was affecting stability before.


----------



## chispy (May 12, 2021)

And ... we got a couple of new world records today with the PowerColor Red Devil Ultimate 6900 XTXH: core clock at 3306 MHz ~ 3274 MHz (the highest benchmark-stable core clock in the world for a GPU) and the highest-ever 3DMark Fire Strike Extreme score. It was done by none other than the good guys at OGS from Greece; congratulations on such a huge achievement. Absolutely massive score

https://hwbot.org/submission/4745694_ogs_3dmark___fire_strike_extreme_radeon_rx_6900_xt_35869_marks


----------



## Felix123BU (May 12, 2021)

chispy said:


> And ...  we got a couple of new World Records today with Powercolor Red Devil Ultimate 6900xtx-h clock speed at 3306Mhz ~ 3274Mhz ( highest core clock in the world for a gpu benchmark stable ) and highest ever 3dmark Firestrike Extreme score. It was done by none the less the good guys OGS from Greece , congratulations on such a huge achievement.  Absouletely Massive score
> 
> https://hwbot.org/submission/4745694_ogs_3dmark___fire_strike_extreme_radeon_rx_6900_xt_35869_marks


One can only wonder why AMD chose to limit the max clocks, probably knowing very well how great the 6000-series chips are for overclocking  
AMD should acknowledge this mistake and release unlocked BIOSes for all cards; that would spark an overclocking craze and would only be good free publicity for them. But well, AMD and marketing.......


----------



## Butanding1987 (May 12, 2021)

Felix123BU said:


> One can only wonder why AMD choose to limit the max clocks probably knowing very well how great the 6000 chips are for overclocking
> AMD should acknowledge this mistake and release some unlocked bioses for all cards, that would spark a overclocking craze and would only be good free publicity for them. But  well, AMD and marketing.......


They will, once the 3080Ti comes out.


----------



## Felix123BU (May 13, 2021)

Mmmmm, beat my all time TimeSpy score by quite some points  





Almost 21k with a 6800XT, not bad 

And Firestrike also went insane


----------



## Butanding1987 (May 13, 2021)

Felix123BU said:


> Mmmmm, beat my all time TimeSpy score by quite some points
> 
> 
> 
> ...


What changed? RAM?


----------



## Felix123BU (May 13, 2021)

Butanding1987 said:


> What changed? RAM?


Mainly I think the liquid metal on the GPU; the last tests were with "normal" paste. Also a fresh Windows install done yesterday, not sure how much that impacted it.
Another thing worth mentioning: my Win 10 was two years "old" and was running the 1usmus CPU power plan. That was OK for my previous 3800X, but I think it hampered the 5800X a bit. I get higher FPS in everything without that power plan, simply using the Windows high-performance plan.
Versus the last scores, I also managed to raise memory from 3733 to 3800 MHz, but I don't think that influenced it much.
GPU memory for the above scores was at 2100; I think I can fine-tune it a bit more, but it's just a benchmark, not that interested in wasting more time on it.
What I find funny, in a way: I checked the 3DMark hall of fame and my Firestrike graphics score is, I think, the second best in that list for a 6800XT, but I might be wrong


----------



## Butanding1987 (May 14, 2021)

Felix123BU said:


> Mainly I think the Liquid Metal on the GPU, last test where with "normal" paste, also a fresh Windows install done yesterday, not sure how much that impacted it.
> Another thing worth mentioning, my Win 10 was 2 years "old" and was running the 1usmus CPU powerplan, that was ok for my previous 3800X, but I think hampered the 5800X a bit. I get higher FPS in everything without that powerplan and simply using the windows high perf plan.
> Vs the last scores, also managed to up memory from 3733 to 3800mhz, but don't think that influenced it that much.
> GPU memory for the above scores was at 2100, I think I can fine tune it a bit more, but its just a benchmark, not that interested in wasting more time on it.
> What I find funny in a way, checked 3DMark hall of fame, my Firestrike Graphics score is I think the second best in that list for a 6800XT, but might be wrong


In my case, my scores are always pulled down by the combined test in Fire Strike. Oddly, I always end up with lower combined test scores even if I get high graphics and CPU scores separately. Your graphics score should be #77.
Fire Strike



Fire Strike Extreme
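A possible explanation for the combined test dragging the total down: if I recall the 3DMark technical guide correctly, the overall Fire Strike score is a weighted harmonic mean of the sub-scores, and a harmonic mean punishes one weak component disproportionately. A rough Python sketch (the weights and sub-scores below are assumptions from memory, not authoritative):

```python
# Sketch of how a weighted harmonic mean punishes one weak sub-score.
# Weights are assumed/from memory, not official 3DMark values.

def fire_strike_total(graphics, physics, combined,
                      w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three Fire Strike sub-scores."""
    weights = (w_g, w_p, w_c)
    scores = (graphics, physics, combined)
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

# Hypothetical sub-scores: same graphics/physics, combined 20% lower.
strong = fire_strike_total(60000, 30000, 25000)
weak = fire_strike_total(60000, 30000, 20000)
print(f"{strong:.0f} vs {weak:.0f}")  # the lower combined run loses noticeably
```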


----------



## Felix123BU (May 14, 2021)

Butanding1987 said:


> In my case, my scores are always pulled down by the combined test in Fire Strike. Oddly, I always end up with lower combined test scores even if I get high graphics and CPU scores separately. Your graphics score should be #77.
> Fire Strike
> View attachment 200249
> Fire Strike Extreme
> View attachment 200250


Is there a way to sort Firestrike scores based on graphics results? That's why I said no. 2; I had no real way to sort them based on graphics only


----------



## Butanding1987 (May 14, 2021)

Felix123BU said:


> Is there a way to sort Firestrike scores based on Graphics Results? That's why I said no.2, had no real way to sort them based on Graphics only


You can. Click “Results,” not “Hall of Fame”.


----------



## Felix123BU (May 14, 2021)

Cool, thx, not a 3DMark expert 
According to that list, for 6800 XTs by Firestrike graphics score only, my latest score is at number 21 (60370) ... that is, if I am not reading/sorting the list wrongly again


----------



## Space Lynx (May 14, 2021)

I really love my RX 6800, though I still wish I could have got the XT. I have had almost zero issues with mine; a couple of games like WoW would get error messages after 1-2 hours of gameplay... but with Re-BAR turned off, those go away... Honestly, I wish they never did SAM... it was a headache figuring out that was the problem. I miss consoles for this reason alone.


----------



## Felix123BU (May 14, 2021)

lynx29 said:


> I really love my rx 6800, still wish I could have got the XT though. I have had 0 issues with mine, a couple games like WoW seem to get error messages after 1-2 hours of gameplay... but with Re-BAR turned off, those go away... honestly wish they never did SAM... it was a headache figuring out that was the problem. I miss consoles for this reason alone.


Have not had crashing issues with ReBar, only a few games that had minor negative scaling with it on. It's a "new" feature; give it some time, it should get better and better for all of us.
Feel the same about my 6800 XT, have not loved a GPU this much since the HD 4870 back in the day


----------



## Space Lynx (May 14, 2021)

Felix123BU said:


> Have not had crashing issues with ReBar, only some few games who had minor negative scaling with it on. It's a "new" feature, give it some time, should get better and better for all of us with time.
> Feel the same about my 6800 XT, have not loved a GPU that much since the HD 4870 back in the day



I only had the crashing issue in WoW; every other game (many others in fact, as I hop around games a lot) was just fine. WoW is just finicky, I guess.


----------



## Raendor (May 14, 2021)

lynx29 said:


> I really love my rx 6800, still wish I could have got the XT though. I have had 0 issues with mine, a couple games like WoW seem to get error messages after 1-2 hours of gameplay... but with Re-BAR turned off, those go away... honestly wish they never did SAM... it was a headache figuring out that was the problem. I miss consoles for this reason alone.


Ah man, I get you. I seem to have a problem with ReBAR on my system too. It leads to GPU utilization drops in RDR2, which disappear either if I turn it off, or if I turn it off, restart, turn it back on, then play. But on the next system startup the drops come back and I have to repeat the process. Plus I'm not sure some crashes I had weren't caused by it as well. I run the latest BIOS, tried various settings and drivers, even did a full Windows reinstall now, but all roads lead to ReBAR. Makes me want to just give up, sell it and use my Series X going forward for gaming.


----------



## GamerGuy (May 14, 2021)

I was among the early ones with SAM enabled (with a Zen 2 CPU), and it has been issue-free since day 1, other than negative scaling in some games, but the cards are so fast in rasterized games that the negative scaling is not noticeable. I understand those with RX 6800 cards wanting the RX 6800 XT; I was the same when I'd gotten a Nitro+ RX 6800 but wanted more POWAH! 

Was waiting for the MERC319 RX 6900 XT when there was an issue with my CC (otherwise, I'd have been happier with the MERC319, NOT that it's a better card than my Nitro+; it's just that the former would have saved me 140 USD), but I have no regrets as the Nitro+ is a mighty fine card (just wished it looked different from the Nitro+ RX 6800 it replaced; it's like I'd not changed anything physically, as they look so similar).


----------



## HaggtGT (May 14, 2021)

Does anyone know the thickness of the thermal pads used on the 6800 XT Red Devil? I think they are 1mm and 1.5mm but not 100% sure.


----------



## Felix123BU (May 15, 2021)

YES! No. 3 on the Firestrike leaderboard, third best graphics score out of all 6800XTs 

61785 Graphics Score
AMD Radeon RX 6800 XT video card benchmark result - AMD Ryzen 7 5800X,Gigabyte Technology Co., Ltd. B550 AORUS PRO (3dmark.com)




This shit is addictive


----------



## Hemmingstamp (May 15, 2021)

Felix123BU said:


> This shit is addictive


But can it play Roblox @200FPS?


----------



## Felix123BU (May 15, 2021)

Hemmingstamp said:


> But can It play Roblox @200FPS?


No idea, but CAN IT RUN CRYSIS?


----------



## Hemmingstamp (May 15, 2021)

Therefore the answer is NO....
I cancelled my 6900 XT because of this 

(The truth is I cancelled it because I paid scalper's price and thought fuggum. I'm not being ripped off)



Felix123BU said:


> No idea, but CAN IT RUN CRYSIS?


I heard the 6800 can do at least 300 FPS. At 720p ultra-low settings, that is...


----------



## Felix123BU (May 15, 2021)

Hemmingstamp said:


> Therefore the answer is NO....
> I cancelled my 6900 XT because of this
> 
> (The truth is I cancelled it because I paid scalpers price and though fuggum. I'm not being ripped off)


Wholeheartedly agree, I would also never pay scalper prices, the moment you do, you give scalpers the ok to continue and present your behind for their pleasure


----------



## Hemmingstamp (May 15, 2021)

Felix123BU said:


> Wholeheartedly agree, I would also never pay scalper prices, the moment you do, you give scalpers the ok to continue and present your behind for their pleasure


That was from an online retail store. Never going to shop with them again. Rip-off bastards.


----------



## GamerGuy (May 15, 2021)

Yeah, and not only online; brick-and-mortar shops (in the local tech mall in my neck of the woods) are selling at crazy high prices too. They've effectively become scalpers themselves. I went to just look around a while back and saw that some cards seemed reasonably priced, but when I enquired, I was told the price listed was the bundled/system-build price; for the card alone, the price left me slack-jawed.

Just really glad I jumped on my card late December/early January 2021, and even then, I fully believed that prices would go down AFTER I'd gotten the card. I'd bought a GF 7950 GX2 and was dumbfounded when the GF 8800 series was released within weeks of my purchase (same goes for my GF4 Ti 4200: the 9700 Pro came out and spanked the entire GF4 Ti series silly). Or maybe I simply have bad luck with nVidia cards.


----------



## Butanding1987 (May 15, 2021)

Felix123BU said:


> This shit is addictive


Congrats! Yes, it is. You won’t have time anymore to play games. Now go get a 5950x to boost your CPU and overall scores.


----------



## Felix123BU (May 15, 2021)

Butanding1987 said:


> Congrats! Yes, it is. You won’t have time anymore to play games. Now go get a 5950x to boost your CPU and overall scores.


Ain't that bad   , it took me only 3 hours to find the max out of the card (and there might still be something left to squeeze out of it), and I can say one thing: without liquid metal I could not have pushed it this far, of that I am sure. Re getting a 5950X: nah. The same way I said no way in hell will I pay 25 EUR for freaking 3DMark, I won't pay 750+ for a thing I don't need, for a better bench score which, besides being fun, is completely irrelevant


----------



## GamerGuy (May 15, 2021)

Felix123BU said:


> Re getting a 5950X, nah, the same I said no way in hell will I pay 25 EUR for freaking 3DMark, same I wont pay 750+ for a thing I don't need for a better bench score which besides being fun, is completely irrelevant


Same here, I was thinking of getting a 5950X (or at least a 5900X) but decided against it as I really don't need it.......yet. I may look at a used or NOS 5950X in 2-3 years' time, just to give my rig a little boost in gaming performance. My 3900X is more than enough for ALL the games I play as of now; I may not get the highest framerate compared to a system with a secksay Ryzen 5000 series CPU, but the 3900X is fast enough to gimme good framerates in games, which is my primary focus.

Reading about your OC'ing experience takes me back to the good old days of my AMD AXP 3200+ (Barton) + ATi 9700 Pro, I was part of a 3DMark 01 team and was trying to be the 3rd to break 20k, man, I sweated bullets trying for 20k, usually in the high 19k (after tweaking and OC'ing)....but I did do it, and it felt great!


----------



## turbogear (May 15, 2021)

Felix123BU said:


> Mmmmm, beat my all time TimeSpy score by quite some points
> 
> View attachment 200220
> 
> ...


Damn, I don't visit the thread for a few days and @Felix123BU starts breaking records.  
That's an amazing score.

I haven't tried checking the limits of the 6900XTU that much, but I can say the LM is tempting. 
At 2750 MHz the performance does not scale that well: the temperature heads fast towards 90°C, and then giving more power brings little gain.
It seems that I need to be brave again, screw my warranty and open the expensive card.


----------



## Felix123BU (May 15, 2021)

turbogear said:


> Damn I don't visit the thread a few days and @Felix123BU start breaking records.
> That's amazing score.
> 
> I haven't tried to check the limits of 6900XTU that much but I can say the LM is tempting.
> ...


LM is pure magic on my chip, and I bet it would drastically change temps on your 6900XT too, see how I am tempting you?  
But if you do it, pretty please be careful; the thought of wrecking a 2000+ EUR GPU gives me shivers 

Also, if you were referring to the old score of 60370 in Firestrike, that's OK; the new best is 61785, and that's a fantastic score. I did not think I could reach such a high score with my puny reference 6800XT


----------



## turbogear (May 15, 2021)

I was talking to an owner of a 6900XTU that scored 2300 in Time Spy.
He was saying the same: until he went to LM he was not able to get higher settings stable.
That guy has been tracking 6900XTU owners on multiple forums and was saying that PowerColor has used crappy thermal paste on these cards. 
The pads, on the other hand, are very high quality.


----------



## Felix123BU (May 15, 2021)

turbogear said:


> I was talking to owner of 6900XTU that scored 2300 Time Spy.
> He wa saying the same that until he went to LM he was not able to get higher settings stable.
> That guy has been tracking 6900XTU owners on multiple forums and was saying that PowerColorhas used crappy thermal past on these cards.
> The pads on the other hand are very high quality.


Well yeah, 6900XTU chips are overclocker cards; one cannot leave the factory thermal paste on, that would be unworthy of the chip


----------



## turbogear (May 15, 2021)

Felix123BU said:


> LM is pure magic on my chip, and I bet it would drastically change temps on your 6900XT too, see how I am tempting you?
> But if you do it, pretty please be careful, the though of wrecking a 2000+ EUR GPU gives me shivers


It seems we live on the edge and nothing stops us from taking risks. 
Anyway, I was not thinking about the computer and enjoyed some time with the family these last days, as we had holidays here.
My son and I also broke our own record by riding 64 km on the bike in 3 hours yesterday.


----------



## Felix123BU (May 15, 2021)

turbogear said:


> It seems we live on the limit and something doesn't stop us from taking risks.
> Anyways I was not thinking about computer and enjoy some time with family last days as we had holidays here.
> My son and me broke also a my own record by driving 64km bike in 3 hours yesterday.


If you are a PC enthusiast, it's part of the job and the fun; not so fun is when your shiny new card fries because you shorted it with LM, so please be careful with that half-a-kidney-valued 6900XTU


----------



## turbogear (May 15, 2021)

Felix123BU said:


> If you are a PC enthusiast, its part of the job and the fun, not so fun is when your shiny new card fries because you shorted it with LM, so please be careful with that half a kidney valued 6900XTU


I applied LM on Vega before, and I delidded an i7 7700K and applied it there.
I did not use much protection then.  
Both survived. The i7 is still being used by my son.
Due to Murphy's law, I will try to be careful this time and apply some protection.

I wonder if there is an expiry date for liquid metal.  
I have one unopened package of Coollaboratory Liquid Pro that I bought about 3 years ago.
I also have a Thermal Grizzly Conductonaut package, part of which I used on a Radeon VII about 2 years ago.
I wonder if these can still be used.

I was watching a video about liquid metal and its use with different types of metal coolers.
LM does stain nickel-plated surfaces.
I had this experience before with the CPU, where the text on the surface of the nickel-plated heat spreader was gone after a year.

Here is video from Gamers Nexus comparing the effects of LM on the surface of Copper, Nickel and Aluminum:
How Liquid Metal Affects Copper, Nickel, and Aluminum (Corrosion Test) - YouTube

So if I don't want to void the warranty on the 2299€ card, then maybe I should stay away from LM.  
For warranty purposes, it would stain the nickel in a way that is not reversible. 
So even if I remove the warranty sticker in a super careful way without breaking it, the sticker will not help, as the cooler will be stained after using LM for some time. 

But maybe I should still open it and apply some better thermal paste like Thermal Grizzly. 

I was also looking for the best alternatives to LM.
Thermal Grizzly Kryonaut is still the best according to the recent review from Guru3D, and better than MX-5.
According to this review, LM (Thermal Grizzly Conductonaut) is only a few degrees better than Kryonaut, but according to the experience of @Felix123BU and @Butanding1987 it is much better than just a few degrees: 

Guru3D Thermal Paste Roundup - Round 2 (2021) - Introduction


----------



## aRcane305 (May 15, 2021)

turbogear said:


> I applied LM on Vega before and delided i7 7700k and applied it there.
> I did not use much protection there.
> Both survived. i7 is still being used by my son.
> Due to Murphis law, I will try to be careful this time and apply some protection.
> ...


Hey turbo, remember I came and asked for help regarding overclocking the 6900XT Toxic? I managed to get 2735 MHz core clocks and 2150 MHz VRAM stable in synthetic benchmarks, using Afterburner to set the clocks and MPT and Wattman for power. Is that a normal thing to do without modding it?


----------



## Butanding1987 (May 16, 2021)

aRcane305 said:


> Hey turbo, remember I came and asked for help regarding overclocking the 6900XT toxic. I managed to get 2735Mhz clocks and 2150 vram stable in synthetic benchmarks, using Afterburner to set the clocks and MPT and wattman for power. is that a normal thing to do without modding it?


Make sure your overclocked VRAM is actually giving you better performance. Sometimes, it may be stable but it actually leads to worse performance. You’ll never know until you do benchmark comparisons.


----------



## GerKNG (May 16, 2021)

An example from me:
stock vs. undervolted. Stable at 2730 MHz at 1150 mV through a Time Spy Extreme loop, over 100 hours of gaming, and other stress tests.






and another benchmark with power limits raised (same 2730 MHz core, 2100 MHz "Fast" memory OC)


----------



## turbogear (May 16, 2021)

aRcane305 said:


> Hey turbo, remember I came and asked for help regarding overclocking the 6900XT toxic. I managed to get 2735Mhz clocks and 2150 vram stable in synthetic benchmarks, using Afterburner to set the clocks and MPT and wattman for power. is that a normal thing to do without modding it?


That looks good. The only thing is to check whether 2150 MHz on the VRAM gives you higher performance or actually less.
On many 6800XT and 6900XT cards the score actually degrades if the memory is pushed to the limit.
Often something below 2130 MHz actually gives higher performance.
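Since a memory clock can be "stable" yet still lose performance to error correction, the only real test is a benchmark sweep. A minimal sketch of what I mean (the scores are made-up placeholders, not real results):

```python
# VRAM error correction can silently retry transfers, so a higher memory
# clock that passes stability tests may still score lower. Sweep and compare.

# Hypothetical (clock MHz -> benchmark score) results from repeated runs;
# replace with your own Time Spy graphics scores.
results = {
    2000: 20100,
    2100: 20450,
    2130: 20520,
    2150: 20380,  # "stable", but error correction is eating the gains
}

best_clock = max(results, key=results.get)
print(f"best memory clock: {best_clock} MHz ({results[best_clock]} points)")
```

Run each clock a few times and average, since benchmark scores vary a bit run to run.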


----------



## kak4rot (May 16, 2021)

Hi all,











This is the shitty max clock I get on my watercooled reference AMD RX 6900 XT when I press the manual setting in the performance tab in Wattman.






It looks like the max stable OC I can reach is 2539 MHz with a 15% power limit and 1175 mV. Increasing the power limit with MPT doesn't change my max clock at all, and if I increase the max clock it becomes unstable in Time Spy.






Would you say I got the shittiest reference card ever? What a pity that the waterblock ended up on this card :cautious:

Another thing I have noticed is random microstutter accompanied by a small drop in fps in some situations in Battlefield V and Quake Champions...


----------



## turbogear (May 16, 2021)

kak4rot said:


> Hi all,
> 
> 
> 
> ...


Well, you are just unlucky, it seems. 
The 6800XT Red Devil that I bought to replace my defective reference 6800XT card was also really shitty. It did not go higher than 2400 MHz and stuttered at those settings in games.
I returned it and got myself a 6900XTU Liquid Devil.
This one can run 2750 MHz at 1750 mV (max 1200 mV possible), but it is not as energy efficient.

The best was my 6800XT reference card that OCed to 2750 MHz at 1020 mV while consuming only 345 W.
It is a pity that it was destroyed by an ESD shock.


----------



## Felix123BU (May 16, 2021)

turbogear said:


> Well you are just unlucky it seems.
> The 6800XT Red Devil that I bought to replace the my defective reference 6800XT cards was also really shitty. It did not go higher than 2400MHz and stuttered at that settings in games.
> I returned it and got myself 6900XTU Liquid Devil.
> This can run 2750MHz at 1750mV (max 1200mV possible) but it is not as energy efficient.
> ...


There should be a warning on all new 6000 chips: may not overclock as well as the first batch


----------



## kak4rot (May 16, 2021)

turbogear said:


> Well you are just unlucky it seems.
> The 6800XT Red Devil that I bought to replace the my defective reference 6800XT cards was also really shitty. It did not go higher than 2400MHz and stuttered at that settings in games.
> I returned it and got myself 6900XTU Liquid Devil.
> This can run 2750MHz at 1750mV (max 1200mV possible) but it is not as energy efficient.
> ...


My stutters are not related to OC; even at stock settings I notice some microstutter. I have tried everything I have read here, on Reddit, the AMD forums, overclock.net, etc. I wonder if I should RMA this garbage


----------



## Felix123BU (May 16, 2021)

kak4rot said:


> My stutters are not related with OC, even on stock setting I notice some microstutter. I have tried everything I have read here, reddit, amd forums, overclock.net. etc. I wonder if I should RMA this garbage


Did you get it from AMD directly? If yes, you should RMA it.


----------



## turbogear (May 16, 2021)

kak4rot said:


> My stutters are not related with OC, even on stock setting I notice some microstutter. I have tried everything I have read here, reddit, amd forums, overclock.net. etc. I wonder if I should RMA this garbage


Yes, I think you should RMA it if it stutters all the time, but I am not sure if your warranty is still valid, as you have opened it. 

The stutter is a very curious issue. My reference 6800XT also started stuttering, but for me it happened after the ESD issue.
My 6800XT card has been in repair for over a month now. 
I would be lucky if they repair it, as I also had it under a water block.
I took the warranty stickers off in good state when I opened it. Still, it would be possible to see that the cooler was taken off if they look closely.

So let's see if at some point I get a replacement or they send it back without repair.

Anyway, I think it was good that I bought a replacement; otherwise I would not have been able to use my computer for a very long time.
I still don't have any definite time frame for when I get it back.


----------



## kak4rot (May 16, 2021)

turbogear said:


> Yes, I think you should RMA it if it stutters all the time, but I am not sure if your warranty is still valid since you opened the card.
> 
> The stutter is a very curious issue. My reference 6800XT also started stuttering, but in my case it happened after an ESD incident.
> My 6800XT card has been in repair for over a month now.
> ...


I wasn't lucky taking off my warranty sticker; I tried with a little heat, but it finally broke. I don't know whether to send it in without the warranty sticker, as if I had never opened it, or to try to find a replacement.

EDIT: I think these stickers would do the job: 









US $1.89 | 360 pcs of 6mm-diameter AMD graphics card warranty seal stickers ("void if removed", V44), AliExpress (es.aliexpress.com)


----------



## Space Lynx (May 16, 2021)

kak4rot said:


> My stutters are not related to OC; even at stock settings I notice some microstutter. I have tried everything I have read here, on Reddit, the AMD forums, overclock.net, etc. I wonder if I should RMA this garbage.



Others have fixed the stutter by setting both the min OC and max OC; you're just setting the max higher but leaving the min at 500. I set my min to 2100 and max to 2550 and have zero issues. But the stutter could also be from your CPU or something overheating, no idea. I have zero issues at stock or when OC'ing.


----------



## Felix123BU (May 16, 2021)

kak4rot said:


> I wasn't lucky taking off my warranty sticker; I tried with a little heat, but it finally broke. I don't know whether to send it in without the warranty sticker, as if I had never opened it, or to try to find a replacement.
> 
> EDIT: I think these stickers would do the job:
> 
> ...


Damn, I did not even think about buying warranty stickers. Fantastic idea! I would give you a prize for this.


----------



## kak4rot (May 16, 2021)

lynx29 said:


> Others have fixed the stutter by setting both the min OC and max OC; you're just setting the max higher but leaving the min at 500. I set my min to 2100 and max to 2550 and have zero issues. But the stutter could also be from your CPU or something overheating, no idea. I have zero issues at stock or when OC'ing.


I didn't have these microstutters in the same PC with a 1080 Ti, and I have tried the solution you provided too. Actually, my settings are 2539MHz max core frequency and 2439MHz min core frequency. I have also tried power-saver options, a fresh install of Windows, clearing the CMOS, stock BIOS settings, etc. All the advice I have read across the whole internet.


----------



## Space Lynx (May 16, 2021)

kak4rot said:


> I didn't have these microstutters in the same PC with a 1080 Ti, and I have tried the solution you provided too. Actually, my settings are 2539MHz max core frequency and 2439MHz min core frequency. I have also tried power-saver options, a fresh install of Windows, clearing the CMOS, stock BIOS settings, etc. All the advice I have read across the whole internet.



Your min is too high.


----------



## kak4rot (May 16, 2021)

lynx29 said:


> Your min is too high.


Can it be harmful? Any recommendations?



turbogear said:


> Yes, I think you should RMA it if it stutters all the time, but I am not sure if your warranty is still valid since you opened the card.
> 
> The stutter is a very curious issue. My reference 6800XT also started stuttering, but in my case it happened after an ESD incident.
> My 6800XT card has been in repair for over a month now.
> ...


I didn't know it was taking AMD so long to send replacement cards.


----------



## Felix123BU (May 16, 2021)

kak4rot said:


> can it be harmful?  Any recommendation?


What is the age and wattage of your PSU? I know for most it's a rhetorical question, but did you connect the card with two separate power cables? What CPU and RAM do you have? The answers to those questions might give some insight into why you are getting stuttering.


----------



## aRcane305 (May 16, 2021)

turbogear said:


> That looks good. The only thing is to check whether 2150MHz on the VRAM gives you higher performance or less.
> On many 6800XT and 6900XT cards the score actually degrades if the memory is pushed to the limit.
> Often something below 2130MHz actually gives higher performance.


I see. I did some more testing and managed to get it to 2817MHz core and 2100MHz VRAM, stable in Heaven and Time Spy. Any reason I shouldn't keep it as it is? I know for a fact it hurts mining hash rates, but would daily use/gaming be fine? I'll do more testing to compare performance. 6900XT Toxic, FYI; people said 2.7GHz was barely possible without LN2, but I managed to hit it with Afterburner.


----------



## kak4rot (May 16, 2021)

Felix123BU said:


> What is the age and wattage of your PSU? I know for most it's a rhetorical question, but did you connect the card with two separate power cables? What CPU and RAM do you have? The answers to those questions might give some insight into why you are getting stuttering.


My PSU is a Cooler Master V1000 (1000W); as far as I know it's built by Seasonic. It must be four or five years old. I use two separate PCIe cables to feed the monster.


----------



## Felix123BU (May 16, 2021)

kak4rot said:


> My PSU is a Cooler Master V1000 (1000W); as far as I know it's built by Seasonic. It must be four or five years old. I use two separate PCIe cables to feed the monster.


1000W is more than enough, but 4-5 years old could be an issue. Power supplies rarely fail outright, but over time they can deliver less power and, more importantly, less stable power. I am not saying this is your case (might be, might not be), but if I were in your situation, I would either test the card with another power supply or test it in someone else's PC (which would be problematic since you put a waterblock on it), just to make sure it's not the card. Other things that can cause stutters are unstable overclocks on the CPU and/or RAM, which may not show up in day-to-day usage. One more possibility is a bad display cable; I had that issue once, with all kinds of weirdness and screen stutter, and a new cable fixed it, but that is fairly rare.

As Turbogear said, it's also possible that the card is physically defective, but for as rare a commodity as a GPU is these days, I would still try to eliminate the other, non-card-related potential problems first.
If, let's say, you test it with another PSU and it displays the same behaviour, RMA would be the way to go.


----------



## turbogear (May 16, 2021)

kak4rot said:


> can it be harmful?  Any recommendation?
> 
> 
> I didn't know it was taking amd so long to send replacement cards


In my case it is not AMD directly but rather XFX.
The reference card was XFX branded, and the online shop that I bought it from sent it back to them.



aRcane305 said:


> I see. I did some more testing and managed to get it to 2817MHz core and 2100MHz VRAM, stable in Heaven and Time Spy. Any reason I shouldn't keep it as it is? I know for a fact it hurts mining hash rates, but would daily use/gaming be fine? I'll do more testing to compare performance. 6900XT Toxic, FYI; people said 2.7GHz was barely possible without LN2, but I managed to hit it with Afterburner.


WOW, 2817MHz is really great. 

How high did your Time Spy score go?
Can you paste a screenshot of your settings from Radeon Software, as well as from MPT?

If the settings are really stable and the power consumption at this frequency is okay for you, then you can leave it.
I usually run a stress test like the Fire Strike Extreme stress test to check stability, or even the Port Royal stress test.
If these tests pass, then the settings are stable.

Sometimes with the Time Spy or Fire Strike benchmarks you can pass a single run, but long term the settings are not stable.
The stress tests help verify the stability of the settings.

I know some guys with 6900XT Ultimate cards went to 500W core power and 384A TDC to get really high scores and stability at high frequencies like yours.
I am personally not comfortable setting my 6900XTU to pull 500W on the core.
Maybe only for a benchmark, but not for daily use.

At the moment I am keeping mine at 2750MHz@1175mV for daily gaming, which is really stable; the power target is 382W with 368A TDC (default for my card), but it does not consume that much in games.
To be honest, with my current 2K 165Hz monitor, I have more than enough FPS in the games that I play.

But for benchmarking I may go higher for a short time.
I am still fighting with myself over whether I should bite the bullet and put liquid metal (LM) on my card to keep temperatures low when pushing it higher than 2750MHz.
The only reason not to move to LM is that after a few months the nickel-plated surface of the water block starts getting dull, as does the surface of the GPU die, which could cause issues if you RMA the card.

At least I found out that warranty stickers for my card can be bought on AliExpress.


----------



## aRcane305 (May 17, 2021)

turbogear said:


> In my case it is not AMD directly but rather XFX.
> The reference card was XFX branded and the online shop that I bought it from sent it back to them.
> 
> 
> ...


Alright, I will do more testing. 2735MHz core and 2150MHz VRAM got me a 21609 GPU score in Time Spy, for a total of 17,850 (with a 5600X).
Actual synthetic clocks seem to be about -50MHz, giving 2701MHz in Time Spy.
I will do more test rounds and keep bumping up the core clocks.

Power consumption still hasn't passed 400W.

My settings are the default BIOS from TechPowerUp; I use MPT to raise Power Limit (GPU) to 525W, TDC GFX to 525A and TDC SOC to 80A.
Then in Wattman, custom settings; the only thing I did was raise the power limit by 15%.
Then in Afterburner I drag the VRAM and core clocks up.

I have yet to test the difference between 2100, 2130, and 2150MHz for the VRAM. 2150 is the maximum the slider allows.

For the core clocks in Afterburner, I can slide it up to 2860MHz, for a real synthetic clock of 2817MHz in Heaven and Time Spy.

Anything over 2860MHz in Afterburner leads to Heaven crashing.

I did all this late at night yesterday and have yet to properly test/benchmark.

Do you reckon I should mess with more settings in MPT?

I'll send screenshots after I get home and do more testing.


----------



## turbogear (May 17, 2021)

aRcane305 said:


> Alright, I will do more testing. 2735MHz core and 2150MHz VRAM got me a 21609 GPU score in Time Spy, for a total of 17,850 (with a 5600X).
> Actual synthetic clocks seem to be about -50MHz, giving 2701MHz in Time Spy.
> I will do more test rounds and keep bumping up the core clocks.
> 
> ...


Well, your power settings are way too high.
As far as I know, the Sapphire Toxic has 2x 8-pin and 1x 6-pin connectors, so the total specified PCIe power limit would be only 450W, not 525W.
Above that you are running the card at your own risk.

Your MPT settings: 525W and 525A TDC.
You added 15% to that with Wattman, which means your limits are set to 603.75W GPU core power and 603.75A TDC.

603.75W is not the total GPU power in this case but only the GPU core power; you need to add roughly another 45W for the other GPU components.
That means 648.75W is the total power your card is allowed to draw from the power supply over the PCIe cables.

So I am not sure these settings are really safe; you could damage your card, even though you say you have seen 400W max.
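The slider arithmetic above can be sketched as a quick check. This is just an illustrative helper (the function and parameter names are made up, and the ~45W non-core overhead is the rough figure mentioned above, not an exact spec):

```python
def effective_limits(mpt_power_w, mpt_tdc_a, slider_pct=15, misc_overhead_w=45):
    """Estimate the limits actually applied after the Wattman power slider.

    The slider scales both the MPT core-power and TDC values by the same
    percentage; the ~45W for non-core components is a rough estimate.
    """
    core_power_w = mpt_power_w * (100 + slider_pct) / 100  # GPU core power limit
    tdc_a = mpt_tdc_a * (100 + slider_pct) / 100           # core current limit
    total_power_w = core_power_w + misc_overhead_w         # whole-card draw estimate
    return core_power_w, tdc_a, total_power_w

core, tdc, total = effective_limits(525, 525)
print(core, tdc, total)  # 603.75 603.75 648.75
```

Plugging in the 525W/525A MPT values reproduces the 603.75W core and ~648.75W total figures above.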

Regarding frequency, yes, it is like that for RDNA2 cards: you get up to -50MHz relative to the setting you apply. So if you set it to 2817MHz, you will see a max real frequency of 2767MHz.

I personally don't use Afterburner but do all my OCing through Wattman.
Some users here on the forum have had compatibility issues with Afterburner.


----------



## kak4rot (May 17, 2021)

Felix123BU said:


> 1000W is more than enough, but 4-5 years old could be an issue. Power supplies rarely fail outright, but over time they can deliver less power and, more importantly, less stable power. I am not saying this is your case (might be, might not be), but if I were in your situation, I would either test the card with another power supply or test it in someone else's PC (which would be problematic since you put a waterblock on it), just to make sure it's not the card. Other things that can cause stutters are unstable overclocks on the CPU and/or RAM, which may not show up in day-to-day usage. One more possibility is a bad display cable; I had that issue once, with all kinds of weirdness and screen stutter, and a new cable fixed it, but that is fairly rare.
> 
> As Turbogear said, it's also possible that the card is physically defective, but for as rare a commodity as a GPU is these days, I would still try to eliminate the other, non-card-related potential problems first.
> If, let's say, you test it with another PSU and it displays the same behaviour, RMA would be the way to go.


When I notice the stutters I can see an FPS drop, for example from 144 FPS to 120 FPS, or from 200 FPS to 140 FPS. I didn't have these microstutters with my OC'd 1080 Ti in the same system.

I have tried stock BIOS settings, clearing the CMOS, and a fresh Windows installation.


----------



## aRcane305 (May 17, 2021)

turbogear said:


> Well, your power settings are way too high.
> As far as I know, the Sapphire Toxic has 2x 8-pin and 1x 6-pin connectors, so the total specified PCIe power limit would be only 450W, not 525W.
> Above that you are running the card at your own risk.
> 
> ...


Regarding what you just said, I still have yet to touch 360W, let alone 400W, so I reckon it's alright for now.

So if I were to change the MPT values, would I change the power limit to 425W?
Or to 370W, because 370W + 15% ≈ 425W?

Also, do I change anything else in MPT besides Power Limit GPU, TDC Limit GFX, and TDC Limit SOC?


----------



## turbogear (May 17, 2021)

aRcane305 said:


> Regarding what you just said, I still have yet to touch 360W, let alone 400W, so I reckon it's alright for now.
> 
> So if I were to change the MPT values, would I change the power limit to 425W?
> Or to 370W, because 370W + 15% ≈ 425W?
> ...


I did not change anything else in MPT.
Many of the other values there do not work anyway for RDNA2.
For example, if one increases the max voltage or max frequency, the card goes into safe mode and gets locked at 500MHz.

For the power settings, you can do it either way; both will work.
If you use HWiNFO, you can also see the actual TDC and max core power limits that are applied.
It can also show things like the current drawn by the GPU core.

One thing to note: if you, for example, set 370W and 350A TDC in MPT and then use the 15% slider in Wattman, you can see in HWiNFO that the actual settings are 425.5W and 402.5A TDC. So whatever values you put in MPT will go 15% higher, for each individual value of core power and TDC.
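Working backwards from a target, the MPT value to enter is just the target divided by the slider factor. A small illustrative sketch (the function name is made up; it assumes the same +15% Wattman slider behaviour described above):

```python
def mpt_value_for_target(target_w, slider_pct=15):
    """MPT value that yields roughly `target_w` once the Wattman slider is applied."""
    return target_w * 100 / (100 + slider_pct)

# For an effective 425W with the slider at +15%, enter about 369.6W in MPT;
# entering 370W instead gives 425.5W effective, matching the HWiNFO reading above.
print(round(mpt_value_for_target(425), 1))  # 369.6
print(370 * (100 + 15) / 100)               # 425.5
```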


----------



## aRcane305 (May 17, 2021)

turbogear said:


> I did not change anything else in MPT.
> Many of the other values there do not work anyway for RDNA2.
> For example, if one increases the max voltage or max frequency, the card goes into safe mode and gets locked at 500MHz.
> 
> ...


OK, so unfortunately after a system reboot, my previous 2817MHz is no longer Heaven stable, let alone Time Spy stable.

After two hours of runs, 2800MHz core and a 2130MHz memory clock finally yielded a one-hour-stable Heaven run. I will now run Time Spy and Port Royal to test.

So the "real" synthetic performance is around 2750MHz.

I have one question: if the overclock is not stable in benchmarks at 99% utilisation, is it still safe to use in games, provided it is stable in games?










turbogear said:


> I did not change anything else in MPT.
> Many of the other values there do not work anyway for RDNA2.
> For example, if one increases the max voltage or max frequency, the card goes into safe mode and gets locked at 500MHz.
> 
> ...


OK, surprise surprise: currently running Time Spy, and interestingly the overclocks applied in Wattman glitch out, while the overclocks applied in Afterburner were stable. The overclocks in the post above were all done in Wattman.

This is weird; I can somehow reach higher stable overclocks in Afterburner than in Wattman.


----------



## sepheronx (May 17, 2021)

kak4rot said:


> I didn't have these microstutters in the same PC with a 1080 Ti, and I have tried the solution you provided too. Actually, my settings are 2539MHz max core frequency and 2439MHz min core frequency. I have also tried power-saver options, a fresh install of Windows, clearing the CMOS, stock BIOS settings, etc. All the advice I have read across the whole internet.


So I fixed the stuttering issue on my 6800XT. I noticed it would stutter at random: the clock would drop to 500MHz and then shoot back up, and that drop-and-spike in frequency is where the microstutters come from. The min frequency is low, and because the card may not be stressed enough in the game, it auto-adjusts, and it just sucks. I raised my min to 2100MHz and my max to about 2300MHz (I did raise it more, but I don't want to shorten the GPU's life), and after doing that my performance has been a lot better. The min still goes low at random, like 1700MHz, but the microstutters were reduced considerably.

It's the crappy power management system.


----------



## turbogear (May 17, 2021)

aRcane305 said:


> OK, so unfortunately after a system reboot, my previous 2817MHz is no longer Heaven stable, let alone Time Spy stable.
> 
> After two hours of runs, 2800MHz core and a 2130MHz memory clock finally yielded a one-hour-stable Heaven run. I will now run Time Spy and Port Royal to test.
> 
> ...


If the stress test is not 99% stable, it can still work in games, but you could sometimes have random game crashes, depending on the game and whether it can drive the GPU to its limit.

For example, I had issues before with Control and my 6800XT at 2750MHz@1020mV. The game crashed instantly, although the setting was 99.8% stable in stress tests. For Control I had to drop back to 2675MHz@985mV to get it stable.

You can try it, and if the games you play work fine, you can keep it.

Regarding what you said about no longer being able to achieve the higher frequency after a reboot, I have had similar experiences in the past with settings that were not 100% stable.
Sometimes they worked perfectly and all benchmarks would go through, and the next day the same setting wouldn't work.



aRcane305 said:


> OK, surprise surprise: currently running Time Spy, and interestingly the overclocks applied in Wattman glitch out, while the overclocks applied in Afterburner were stable. The overclocks in the post above were all done in Wattman.
> 
> This is weird; I can somehow reach higher stable overclocks in Afterburner than in Wattman.



Okay, that is an interesting discovery.
It seems Afterburner is less buggy in this situation than Radeon Software.
Maybe I should try Afterburner.
I have never used it for OCing before, only for monitoring. The only time I tried, I was unable to unlock the OC features.

You don't have TRIXX installed, do you?
I had a 6900XT Toxic for a short time, and TRIXX was interfering with Wattman. I returned that card at the time, as my sample was not able to do more than a 2620MHz real clock no matter how much power I pushed through it.


----------



## aRcane305 (May 17, 2021)

turbogear said:


> If the stress test is not 99% stable, it can still work in games, but you could sometimes have random game crashes, depending on the game and whether it can drive the GPU to its limit.
> 
> For example, I had issues before with Control and my 6800XT at 2750MHz@1020mV. The game crashed instantly, although the setting was 99.8% stable in stress tests. For Control I had to drop back to 2675MHz@985mV to get it stable.
> 
> ...



For now I'm back to using the Toxic Boost in TRIXX as my daily driver. Overclocking only improves FPS and benchmark scores.

Yes, I have TRIXX. Interesting thing today: even 2750MHz is not stable for me in Time Spy. Really weird.
I'll take a break today and do more testing/overclocking tomorrow.

Here are some pics of my build if you're interested. I'm hoping to get featured on PCPartPicker; if you could upvote it, that would be awesome.

*https://au.pcpartpicker.com/b/tqzNnQ#cx3954658 *


----------



## Raendor (May 17, 2021)

OK, I'm not a club member anymore, as I got rid of my 6800XT and traded it for a 3060 Ti plus a couple hundred in cash, until I get my hands on a 3080. After a month of struggling with AMD drivers, I just gave up. A bit sad, considering the raw power is great and the stock model itself is outstanding. But good luck to all you guys, and I hope your experience is miles better than mine turned out to be!


----------



## Felix123BU (May 17, 2021)

kak4rot said:


> When I notice the stutters I can see an FPS drop, for example from 144 FPS to 120 FPS, or from 200 FPS to 140 FPS. I didn't have these microstutters with my OC'd 1080 Ti in the same system.
> 
> I have tried stock BIOS settings, clearing the CMOS, and a fresh Windows installation.


Wait, looking at those FPS ranges you mentioned, is your display FreeSync? One thing that can cause stutters is your FPS going over or under the FreeSync range.
What happens if, let's say, in said game you enable VSync, which would lock the FPS to your monitor's refresh rate?


----------



## turbogear (May 17, 2021)

aRcane305 said:


> For now I'm back to using the Toxic Boost in TRIXX as my daily driver. Overclocking only improves FPS and benchmark scores.
> 
> Yes, I have TRIXX. Interesting thing today: even 2750MHz is not stable for me in Time Spy. Really weird.
> I'll take a break today and do more testing/overclocking tomorrow.
> ...


That's a really nice and clean build. It looks beautiful.

That sounds similar to what I experienced with the Toxic.
One time I was able to go higher than 2750MHz at around 1050mV, but the next day I tried everything and could not get it higher than 2675MHz@1090mV.
I never understood the reason, but I returned the card because at the time I had a super 6800XT sample that did 2750MHz@1020mV and was only 5% slower than my Toxic while consuming much less power.
I kept the 6800XT and returned the Toxic.

Unfortunately the 6800XT died because of my own carelessness with ESD, and now I own a 6900XTU.


----------



## kak4rot (May 17, 2021)

Felix123BU said:


> Wait, looking at those FPS ranges you mentioned, is your display FreeSync? One thing that can cause stutters is your FPS going over or under the FreeSync range.
> What happens if, let's say, in said game you enable VSync, which would lock the FPS to your monitor's refresh rate?


I disabled FreeSync in both Wattman and the monitor, and the microstutters are present with VSync on and off.


----------



## Felix123BU (May 17, 2021)

kak4rot said:


> I disabled FreeSync in both Wattman and the monitor, and the microstutters are present with VSync on and off.


Other than testing with another power supply, there is not a lot I can suggest, since you have checked all the other potential issues, especially the BIOS reset, which would have ruled out any RAM OC instability. So it's either PSU related, the display cable, or the GPU has a physical problem, in which case RMA is the way to go.


----------



## kak4rot (May 17, 2021)

Felix123BU said:


> Other than testing with another power supply, there is not a lot I can suggest, since you have checked all the other potential issues, especially the BIOS reset, which would have ruled out any RAM OC instability. So it's either PSU related, the display cable, or the GPU has a physical problem, in which case RMA is the way to go.


I don't know why, but I think my problem is related to the drivers. With the settings that have been recommended to me here and on other forums, the microstutter has decreased a lot, and now it only shows up on isolated transitions from low or medium to high load in games, because in benchmarks the card always runs at full load.

The right thing would be to try it in another PC, but for that I would have to drain the loop and reassemble the original heatsink... such a chore.


----------



## Felix123BU (May 17, 2021)

kak4rot said:


> I don't know why, but I think my problem is related to the drivers. With the settings that have been recommended to me here and on other forums, the microstutter has decreased a lot, and now it only shows up on isolated transitions from low or medium to high load in games, because in benchmarks the card always runs at full load.
> 
> The right thing would be to try it in another PC, but for that I would have to drain the loop and reassemble the original heatsink... such a chore.


I so know the feeling. Water makes things a lot cooler, but when it comes to taking anything out... drain the loop it is. Oh well, life is not perfect.

It would be easier if you had a spare PSU, but who keeps a brand-new or newer PSU sitting outside their case?

The reason I am suggesting these things is that my experience with my 6800XT has been almost flawless, so I guess, or hope, that for you it's also something that can be fixed; the alternative would be a hardware issue and an RMA.


----------



## Godhand007 (May 23, 2021)

So, have people tried any ray tracing benchmarks on 6900XT/6800XT cards? It seems that both cards sit between the 3070 and 3080 for RT performance (at reasonable settings).


----------



## Felix123BU (May 23, 2021)

Godhand007 said:


> So, have people tried any ray tracing benchmarks on 6900XT/6800XT cards? It seems that both cards sit between the 3070 and 3080 for RT performance (at reasonable settings).


Not so much benchmarks, but in RT games they sit between the 3070 and 3080 in terms of RT performance, generally speaking.


----------



## xenosys (May 23, 2021)

Godhand007 said:


> So, have people tried any ray tracing benchmarks on 6900XT/6800XT cards? It seems that both cards sit between the 3070 and 3080 for RT performance (at reasonable settings).



Native RT performance for the 6800XT is indeed between the 3070 and 3080, with the 6900XT closer to the 3080 and the 6800XT closer to the 3070 in terms of 3DMark scores.

The RTX 3070 averages around 8300 in Port Royal, whereas the 6800XT is around 9500, the 6900XT around 10300, and the RTX 3080 around 11500. I can personally hit around 10600 in Port Royal with my 6800XT under water and pushing 2.7GHz, so if you luck out and get a good chip, you can push it a bit more.

In games, however, there isn't much to separate the 6800XT and 3070. Watch Dogs: Legion delivers better performance on the 6800XT by a couple of frames with RT turned on, for example.


----------



## Godhand007 (May 23, 2021)

Felix123BU said:


> Not so much benchmarks, but in RT games they are between the 3070 and 3080 in regards to RT performance generally speaking.


I was going by performance in Metro EE, synthetics, and some AMD-sponsored titles. Though if one is limited completely by RT, it seems the 6900XT barely exceeds the 3070, at least in Metro EE. The thing is, game scenes are not completely rendered RT effects; that's why the RX 6900 XT can almost match the RTX 3080 in the Metro EE benchmark with all RT effects on, but falls behind in RT-limited scenarios. So the problem is that we don't know how much RT is good enough and how much is pasted into RTX games (almost all sponsored by Nvidia), like 64x Hairworks or 30x tessellation.



xenosys said:


> Native RT performance for the 6800XT is indeed between the 3070 and 3080, with the 6900XT closer to the 3080 and the 6800XT closer to the 3070 in terms of 3DMark scores.
> 
> The RTX 3070 averages around 8300 in Port Royal, whereas the 6800XT is around 9500, the 6900XT around 10300, and the RTX 3080 around 11500. I can personally hit around 10600 in Port Royal with my 6800XT under water and pushing 2.7GHz, so if you luck out and get a good chip, you can push it a bit more.


Yeah, that seems about right. I can get ~11100+ on my RX 6900 XT. The thing is, is this the correct performance for reasonable RT? For example, the 6900 XT performs well at RT Normal, but RT Ultra just tanks it. Also, note that they turned the raster settings down one notch to make the scenarios RT limited, since increasing the raster settings helps relieve the stress on AMD in this particular title.


----------



## Space Lynx (May 23, 2021)

I wish ray tracing had never happened. If they had focused instead on raw FPS, we would all be gaming at 1440p 240Hz with 240 FPS and no dips, and our mouths would drop at how smooth and immersive games had become. Ray tracing is, and always will be, too detrimental an FPS hit even with DLSS, and of the few games that support RT, only a handful even make it look good.

Oh well. /shrug


----------



## Godhand007 (May 23, 2021)

lynx29 said:


> I wish ray tracing had never happened. If they had focused instead on raw FPS, we would all be gaming at 1440p 240Hz with 240 FPS and no dips, and our mouths would drop at how smooth and immersive games had become. Ray tracing is, and always will be, too detrimental an FPS hit even with DLSS, and of the few games that support RT, only a handful even make it look good.
> 
> Oh well. /shrug


I agree with this to an extent, though people will say that GPUs were already reaching the limits of rasterization and that Nvidia took a genius step by bringing in something new (of course, RT had been around for some time before that). We also need CPUs and game engines that support proper 144 FPS gameplay, let alone 240 FPS (e-sports titles excluded).

About games looking good with RT: shadows and lighting are good, but this mirror-like puddle-reflection craze is beyond me.


----------



## xenosys (May 23, 2021)

Godhand007 said:


> I was going by performance in Metro EE, synthetics, and some AMD-sponsored titles. Though if one is limited completely by RT, it seems the 6900XT barely exceeds the 3070, at least in Metro EE. The thing is, game scenes are not completely rendered RT effects; that's why the RX 6900 XT can almost match the RTX 3080 in the Metro EE benchmark with all RT effects on, but falls behind in RT-limited scenarios. So the problem is that we don't know how much RT is good enough and how much is pasted into RTX games (almost all sponsored by Nvidia), like 64x Hairworks or 30x tessellation.
> 
> 
> Yeah, that seems about right. I can get ~11100+ on my RX 6900 XT. The thing is, is this the correct performance for reasonable RT? For example, the 6900 XT performs well at RT Normal, but RT Ultra just tanks it. Also, note that they turned the raster settings down one notch to make the scenarios RT limited, since increasing the raster settings helps relieve the stress on AMD in this particular title.
> ...



Interesting graph, thanks. It would seem that even the 2080 Ti/3070 stands up well to the 6900XT when RT is extensively utilised. The less RT becomes a factor, the further ahead the 6900XT pulls, as you say.


----------



## Godhand007 (May 24, 2021)

xenosys said:


> Interesting graph, thanks. It would seem that even the 2080 Ti/3070 stands up well to the 6900XT when RT is extensively utilised. The less RT becomes a factor, the further ahead the 6900XT pulls, as you say.


But that's the point: is RT actually being heavily utilized? The problem is that we don't have a proper RT showcase game from AMD to give us an alternative to the Nvidia-sponsored RTX games. Also, you might want to read the relevant bits of the article, as it seems Hairworks might be causing huge performance dips on AMD. The best sort of "objective" test we have is 3DMark Port Royal, where the RX 6900 XT is ~10% behind an RTX 3080.


----------



## Chomiq (May 24, 2021)

RT is not going away; it is here to stay, and it is needed for real progress in realtime 3D graphics.


----------



## Godhand007 (May 24, 2021)

Chomiq said:


> RT is not going away; it is here to stay, and it is needed for real progress in realtime 3D graphics.


I don't think anyone here is saying anything remotely close to that. Many things are needed for progress in graphics technology: HDR, AI upscaling/conversion, RT, and much more, if one is talking about 3D games.


----------



## Kabouter Plop (May 28, 2021)

I got a 6900 XT a month ago and put it under water. I now get GPU temps of 50°C with a 65°C max hotspot, usually averaging 60-62°C; it only peaks at 65°C for about a second. That's with a triple 360mm radiator setup, fans averaging around 1000 RPM, very quiet, and a room temp of around 22°C.
This is my first block fit. I've heard the thermal pads EKWB recommends for some chips can vary in height, although I've seen users who fitted their block badly end up with hotspot temps above 80-90°C, so I don't think I fitted mine improperly. For thermal paste I used an X pattern with MX-4, and temps have improved: it started at 68°C max hotspot an hour after refilling the loop, before I had bled out all the air. With the old fan profile from my GTX 1080 the GPU idles at 37°C hotspot; otherwise it idles at 33°C. My fans stop spinning briefly at idle, though with the old profile they stay stopped for a lot longer.


----------



## Felix123BU (May 28, 2021)

Kabouter Plop said:


> I got a 6900 XT a month ago and put it under water. I now get GPU temps of 50°C with a 65°C max hotspot, usually averaging 60-62°C; it only peaks at 65°C for about a second. That's with a triple 360mm radiator setup, fans averaging around 1000 RPM, very quiet, and a room temp of around 22°C.
> This is my first block fit. I've heard the thermal pads EKWB recommends for some chips can vary in height, although I've seen users who fitted their block badly end up with hotspot temps above 80-90°C, so I don't think I fitted mine improperly. For thermal paste I used an X pattern with MX-4, and temps have improved: it started at 68°C max hotspot an hour after refilling the loop, before I had bled out all the air. With the old fan profile from my GTX 1080 the GPU idles at 37°C hotspot; otherwise it idles at 33°C. My fans stop spinning briefly at idle, though with the old profile they stay stopped for a lot longer.


Those temps are perfectly fine, no need to worry


----------



## Kabouter Plop (May 28, 2021)

Felix123BU said:


> Those temps are perfectly fine, no need to worry



What kind of temps do you get with the Rage profile (270-300W power limit) on water?


----------



## Felix123BU (May 28, 2021)

Kabouter Plop said:


> What kind of temps do you get with the Rage profile (270-300W power limit) on water?


My 6800XT on water with slightly raised power limits via MPT (MorePowerTool), so around 350W max, gets 29°C at idle, around 43°C under load, and around 56°C on the hotspot, but that's with liquid metal between the block and the chip. With normal paste I got basically the same temps you have now; LM reduced temps by around 10°C with the same setup.


----------



## Kabouter Plop (May 28, 2021)

I don't think I would dare mess with LM and risk damaging the card, although I'm sure it could be applied safely.


----------



## Felix123BU (May 28, 2021)

Kabouter Plop said:


> I don't think I would dare mess with LM and risk damaging the card, although I'm sure it could be applied safely.


Well, LM is not really needed unless you want the lowest possible temps; that's why I said your temps are perfectly fine as they are.


----------



## turbogear (May 29, 2021)

Felix123BU said:


> Well, LM is not really needed unless you want the lowest possible temps; that's why I said your temps are perfectly fine as they are.


Yes, especially on a 6800XT, LM is not a must.
With good thermal pastes like Thermal Grizzly Kryonaut or MX-4/MX-5 one can get decent temperatures, though of course nothing compared to LM.

The LM is amazing. I was hitting 92°C on the hotspot at 400W in benchmarks like Fire Strike with my 6900XTU Liquid Devil Ultimate and the original factory paste.
I changed to LM and the results are unbelievable.
In the same benchmarks I don't go higher than 65°C on the hotspot, with a 10°C delta to GPU temperature.
In Furmark, the hotspot stays around 75°C.

Until now I had been running it at 2750MHz@1175mV with 2550MHz on the min frequency bar, without memory OC.

Now I am trying to tune the 6900XTU for the best possible clock after changing to LM.
I limited myself to 415W core power and 400A TDC. I was able to run a 2750MHz max bar and 2600MHz min bar at 1085mV, but my Time Spy score (21400) remains similar to what I had before at 2750MHz@1175mV, because the real frequency sits in the same 2690-2700MHz range. Power consumption should be lower now that a lower voltage is applied.
Note this is all without memory OC. I haven't OC'd the memory yet, as I first want a stable OC on the GPU before touching memory as well.

In the Heaven benchmark I can run it for hours at a 2850MHz max bar and 2700MHz min bar at 1100mV without issues; the clock hovers in the 2750-2775MHz range. Any setting above a 2750MHz max clock is unstable in 3DMark benchmarks, though.
Fire Strike, for example, stops after a few minutes at the 2850MHz setting, even though power consumption doesn't exceed 415W at the point where it stops.
I also tried voltages up to 1200mV to get 2850MHz stable in Fire Strike, but so far I haven't been able to get any 3DMark benchmark to complete at frequencies above 2750MHz.

Somehow 2750MHz seems like a brick wall for me so far.
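For anyone wondering why the score stays flat while the power draw should fall: to first order, dynamic power scales with V²·f, so at the same effective clock the voltage drop alone shaves a meaningful chunk off core power. A back-of-the-envelope sketch (it ignores static leakage and boost behaviour, so treat it as a rough upper bound, not a measurement):

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# At the same effective clock the capacitance term cancels, so the
# ratio between two operating points reduces to (V_new/V_old)^2 * (f_new/f_old).

def dynamic_power_ratio(v_new_mv, v_old_mv, f_new_mhz, f_old_mhz):
    """Approximate ratio of dynamic core power between two operating points."""
    return (v_new_mv / v_old_mv) ** 2 * (f_new_mhz / f_old_mhz)

# The two operating points above: 1175 mV vs 1085 mV, both ~2700 MHz real clock.
ratio = dynamic_power_ratio(1085, 1175, 2700, 2700)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic core power")  # prints ~15%
```

So the ~90 mV drop alone is worth roughly 15% of dynamic core power, which is consistent with the score staying put while consumption falls.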


----------



## Butanding1987 (May 29, 2021)

I knew you couldn’t resist it.


----------



## xenosys (May 29, 2021)

I'm currently getting around 53-55°C (hotspot between 70 and 75°C) under water with Kryonaut on my 6800XT, drawing around 370W, overclocked to 2.7GHz (2112MHz fast-timing memory).

What sort of temp drops would I likely get by switching to LM with the above settings?  5 degrees? 10 degrees?


----------



## Kabouter Plop (May 29, 2021)

Probably 10°C, but liquid metal can destroy aluminium and potentially damage the GPU if not used correctly. It's also electrically conductive, so you don't want it touching parts it shouldn't, although I'm not sure about that last one.


----------



## Felix123BU (May 29, 2021)

xenosys said:


> I'm currently getting around 53-55 degrees (hotspot of between 70 and 75) under water with Kryonaut on my 6800XT drawing around 370w overclocked to 2.7Ghz (2112Mhz Fast Timing Memory).
> 
> What sort of temp drops would I likely get by switching to LM with the above settings?  5 degrees? 10 degrees?


My 6800XT with LM was -12°C on the hotspot and -10°C on the core vs Kryonaut.


----------



## xenosys (May 29, 2021)

Kabouter Plop said:


> Probably 10°C, but liquid metal can destroy aluminium and potentially damage the GPU if not used correctly. It's also electrically conductive, so you don't want it touching parts it shouldn't, although I'm not sure about that last one.



Really? Interesting. I've heard people use nail polish around the GPU die to ensure the LM doesn't short any components it may touch should it leak or run. I've also got myself an EK Quantum Vector block, which is nickel-plated copper, so that shouldn't pose a problem.



Felix123BU said:


> My 6800XT with LM was -12°C on the hotspot and -10°C on the core vs Kryonaut.



That's amazing. So I could be looking at around 45°C with LM and a hotspot of 60-65°C at full load. I think I've pushed my card as far as it can go when overclocking; I doubt it'll run stable above 2.7GHz in games, but the temp drop is worth considering.


----------



## Godhand007 (May 29, 2021)

Seeing guys here with such low temps on their cards, I'm just wondering how much my reference card is being held back by temps. My current settings are 2665MHz (max), 2150MHz memory, ~430W, 1175mV (can't get stability on anything lower), which are stress-test stable. Clocks always stay above 2600MHz, but the junction temp can reach 110°C in certain extreme situations, which forces a downclock. Any ideas on how I can raise my clocks and hit that sweet 2.7GHz game-stable mark while on air?

Also, are people here not afraid of voiding their warranty when messing around with their cards?


----------



## Felix123BU (May 29, 2021)

xenosys said:


> Really? Interesting.  I've heard people use nail polish around the GPU die to ensure the LM doesn't short any components it may touch should it leak/run.  I've also got myself an EK Quantum Vector block which is Nickel-plated Copper so that shouldn't pose a problem.
> 
> 
> 
> That's amazing.  So I could be looking at anything around 45 degrees with LM and a hotspot of 60-65 at full load.  I think I've pushed my card as far as it can go when overclocking.  I doubt it'll be able to push it higher than 2.7Ghz stable in games but the temp drop is worth considering.


Nickel-plated copper has no issues with LM; only bare copper and aluminum do. Aluminum has the biggest problem with LM, copper a bit less, and nickel-plated basically none, except maybe some staining, but no corrosion. In the past I used LM on a copper heatsink; it did some very minor damage, but nothing too bad. On nickel, LM is almost harmless in general.

I was amazed by how much the temps dropped after applying LM on the chip; I was expecting a good drop, but not this much. I used normal MX4 thermal paste to cover everything around the chip, so that if any drop of LM traveled somewhere else it would not cause damage: it's non-conductive, easy to apply and cover with, easy to remove, and still a heat conductor.

For the 6800XT, ~2.7GHz is about the max it can go. I read somewhere there was a good reason for AMD to cap it at 2.8GHz, since the logic inside the chip could not handle more. The only reasons for applying LM are great temps and maybe helping stabilize your max overclock. For me it was 100% worth it just for the temps; I'm also getting 2.7GHz, 2725MHz at best.

Temps are around 44-45°C on the chip and 56-57°C max on the hotspot with a 350W limit, usually drawing 310-320W.



Godhand007 said:


> Seeing guys here with such low temps on their cards, I'm just wondering how much my reference card is being held back by temps. My current settings are 2665MHz (max), 2150MHz memory, ~430W, 1175mV (can't get stability on anything lower), which are stress-test stable. Clocks always stay above 2600MHz, but the junction temp can reach 110°C in certain extreme situations, which forces a downclock. Any ideas on how I can raise my clocks and hit that sweet 2.7GHz game-stable mark while on air?
> 
> Also, are people here not afraid of voiding their warranty when messing around with their cards?


"Also, are people here not afraid of voiding their warranty when messing around with cards?" Yes and no.
Yes, because GPUs are such a rare commodity these days; no, because I have done these things countless times in the past and I'm not afraid of doing it again. Besides, if I were scared of tuning any piece of hardware, I would buy a console and live happily ever after. Plus, it's no fun if you don't take any risks.


----------



## Godhand007 (May 29, 2021)

Felix123BU said:


> "Also, are people here not afraid of voiding their warranty when messing around with cards?" Yes and no.
> Yes, because GPUs are such a rare commodity these days; no, because I have done these things countless times in the past and I'm not afraid of doing it again. Besides, if I were scared of tuning any piece of hardware, I would buy a console and live happily ever after. Plus, it's no fun if you don't take any risks.


I am not opposed to tweaking hardware, or else I wouldn't be pushing my card so hard on air. The point about the warranty is that sticker removal, or other evidence of change, is introduced while doing a thermal paste swap etc., which is an almost guaranteed warranty rejection with few exceptions. Bungee jumping is fun, but it pays to keep the cord attached.

These are my settings for the reference RX 6900 XT. What are your thoughts? Am I overdoing it or being too moderate?


----------



## Felix123BU (May 29, 2021)

Godhand007 said:


> I am not opposed to tweaking hardware, or else I wouldn't be pushing my card so hard on air. The point about the warranty is that sticker removal, or other evidence of change, is introduced while doing a thermal paste swap etc., which is an almost guaranteed warranty rejection with few exceptions. Bungee jumping is fun, but it pays to keep the cord attached.


I got the best idea ever from this forum: buying warranty stickers; there is a discussion on the topic here. The only thing that would worry me with the reference cards is that, at least for the 6800XT, they come with a thermal pad, not paste, and that is hardly salvageable, which could be a reason for warranty rejection. Then again, if the stickers are intact, you can make the case that's how you got it. I know, it's far-fetched. Another issue with the reference cards, and it happened with mine: I tried to put paste on it, removed the pad, and had the unpleasant surprise that the cold plate was not very even, so I could not get the hotspot under control anymore. That's what made me get a waterblock. I was relatively happy with the stock cooler, but I "needed" to tinker with it.

From your settings, they are not overly aggressive. Your problem is that the reference cooler has its limits, and if it also came with a thermal pad rather than paste, that won't allow much OC room anyway. You could try replacing the paste, but that might not work (it didn't for me); the cooler itself is perfectly fine at stock settings, just not for hard OCs.


----------



## Butanding1987 (May 29, 2021)

Felix123BU said:


> Nickel-plated Copper would not have issues with LM, only copper and aluminum have issues with LM,  aluminum has most problem with LM, copper a bit less, and nickel plated basically none, except maybe some staining but not corrosion. In the past I also used LM on a copper heatsink, id did some very minor damage, but nothing to bad. On nickel LM is almost harmless in general.


The effect of liquid metal on copper is merely cosmetic; functionally, temps should not be affected by years of use.


----------



## Felix123BU (May 29, 2021)

Butanding1987 said:


> The effect of liquid metal on copper is merely  cosmetic. Functionally, temps should not be affected by years of use.


Yup, those were my observations too after using LM on a copper-based AIO for two years.


----------



## Godhand007 (May 29, 2021)

Felix123BU said:


> I got the best idea ever from this forum: buying warranty stickers; there is a discussion on the topic here. The only thing that would worry me with the reference cards is that, at least for the 6800XT, they come with a thermal pad, not paste, and that is hardly salvageable, which could be a reason for warranty rejection. Then again, if the stickers are intact, you can make the case that's how you got it. I know, it's far-fetched. Another issue with the reference cards, and it happened with mine: I tried to put paste on it, removed the pad, and had the unpleasant surprise that the cold plate was not very even, so I could not get the hotspot under control anymore. That's what made me get a waterblock. I was relatively happy with the stock cooler, but I "needed" to tinker with it.
> 
> From your settings, they are not overly aggressive. Your problem is that the reference cooler has its limits, and if it also came with a thermal pad rather than paste, that won't allow much OC room anyway. You could try replacing the paste, but that might not work (it didn't for me); the cooler itself is perfectly fine at stock settings, just not for hard OCs.


Appreciate your input. You seem to have tinkered a lot with the card. Do you have any idea why the fan speed never goes up to 100%, or 3300 RPM? For me, the max fan speed is 93% in GPU-Z, at around ~3080 RPM.


----------



## Butanding1987 (May 29, 2021)

@Godhand007, you should try “undervolting” the 6900 XT to get better performance.


----------



## Felix123BU (May 29, 2021)

Godhand007 said:


> Appreciate your input. You seem to have tinkered a lot with the card. Do you have any idea why the fan speed never goes up to 100%, or 3300 RPM? For me, the max fan speed is 93% in GPU-Z, at around ~3080 RPM.


The max is just a theoretical max; sometimes fans reach it, sometimes they don't. That's normal with most fans (though 93%, if accurate, seems on the low side; most fans reach at least 95%). And as @Butanding1987 said, undervolting might help lower your temps, though there's no guarantee it will work. It doesn't hurt to try; I would start from 1125mV and go down in steps of 10.
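The "start at 1125 and drop in steps of 10" routine is really just a simple downward search. A sketch of it, where `is_stable()` is a hypothetical stand-in for whatever stress test you trust (Time Spy loops, an hour of Heaven, your usual games); here it just pretends everything down to 1085 mV passes:

```python
# Step-down undervolt search: walk the core voltage down in fixed steps
# and return the last value that passed the stability test.

def find_lowest_stable_mv(start_mv=1125, floor_mv=1000, step_mv=10):
    """Try voltages from start_mv downward; stop at the first failure."""
    last_stable = None
    for mv in range(start_mv, floor_mv - 1, -step_mv):
        if not is_stable(mv):      # crash/artifacts -> back off and stop
            break
        last_stable = mv
    return last_stable

def is_stable(mv):
    # Placeholder: in practice you set `mv` in Radeon Wattman and run your
    # stress test; here we pretend anything >= 1085 mV passes.
    return mv >= 1085

print(find_lowest_stable_mv())  # 1085 with the placeholder above
```

In practice you would then bump the result back up a step or two for margin, since a voltage that survives one benchmark run isn't necessarily game-stable.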


----------



## turbogear (May 29, 2021)

Butanding1987 said:


> I knew you couldn’t resist it.


Yes, I know, every time I say I won't do it, but I really cannot resist. 
It is simply the tweaker genes in me that in the end take control and force me to do such things. 

By the way, I got a refund for the 6800XT I destroyed earlier with ESD. 
So the price of the 6900XTU doesn't hurt anymore; recalculating with the money I got back for the 6800XT, I ended up paying 1297€ for the 6900XTU Liquid Devil Ultimate.
That is the price a regular 6900XT had a few months ago. Now even a standard 6900XT costs 2000€.



Kabouter Plop said:


> Probably 10c but liquid metal can destroy aluminium potentially damage gpu if not used correctly, its probably also conducting electricity so you dont want it touching parts it should't altho i am not sure about that last one.


If one is careful, then LM is not a big problem.
I have been using it for many years.

As @Felix123BU mentioned, it should not be used with aluminum, as that will be destroyed. On copper there is a chemical reaction in which the gallium in the liquid metal combines with the copper, causing permanent discoloring, but the heatsink is not destroyed.
With nickel there is slight staining after some time, but it does not cause any major issues.

Regarding application, I filled the area around the GPU die with MX-5, covering any exposed components. MX-5 is not conductive, so it insulates the components in case any LM runs off.
It is also easier to remove than, for example, nail polish.

If one is careful with the quantity of LM, it does not run off. You need a very, very tiny amount; that is more than enough to cover the whole GPU die.
I used Thermal Grizzly Conductonaut, which comes with a cotton-swab-like tool to spread the LM onto the die. If liquid is still floating around on the die after applying, you applied too much.
Too much is a problem, as it will run off and spread around the area surrounding the die.


----------



## Godhand007 (May 29, 2021)

Butanding1987 said:


> @Godhand007, you should try “undervolting” the 6900 XT to get better performance.


Unfortunately, the current clocks are not stable below 1175mV.



Felix123BU said:


> The max is just a theoretical max; sometimes fans reach it, sometimes they don't. That's normal with most fans (though 93%, if accurate, seems on the low side; most fans reach at least 95%). And as @Butanding1987 said, undervolting might help lower your temps, though there's no guarantee it will work. It doesn't hurt to try; I would start from 1125mV and go down in steps of 10.


This is the first time I have encountered this issue; fans on a multitude of GPUs/CPUs have always reached their max for me. Someone suggested this might be a quirk of the RX 6900 XT reference cards, but it's difficult to say what's true.


----------



## Wrathier (May 29, 2021)

I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt

Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.

Anyone want to share some experiences?


----------



## turbogear (May 29, 2021)

Godhand007 said:


> Unfortunately, Current clocks are not stable without 1175mv.


Yes, the 6900XT doesn't like that much undervolt.
That's even the case for my XTXH card.
My 6900XTU runs 1200mV by default; 1175mV is the voltage I have been using for 2750MHz.

I got it down to 1085mV by tightening the lower frequency limit, but I still need to run stress tests to see whether that will always be stable. I am not too confident, as my 6900XT likes higher voltages.

With my 6800XT before, I could set 2750MHz@1020mV. I still feel very sad when I remember that my super 6800XT died. 



Wrathier said:


> I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt
> 
> Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.
> 
> Anyone want to share some experiences?


Welcome to the club. 
6800XTs are nice cards.
Interesting that you went from a 3090 to a 6800XT. 
You are lucky that you could grab one directly from AMD; that's the only place where you can get it at MSRP.

I tried twice myself to get a new one there after my 6800XT died from an ESD shock, but it was almost impossible: they sold out within seconds each time, while I was still trying to pay with PayPal.


----------



## Felix123BU (May 29, 2021)

Wrathier said:


> I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt
> 
> Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.
> 
> Anyone want to share some experiences?


If you are not very unlucky, you can overclock it and it will be very, very close to a 3090, except in RT, where it's a bit behind. A great card for the price anyway; enjoy it, and welcome.


----------



## Godhand007 (May 29, 2021)

turbogear said:


> Yes, the 6900XT doesn't like that much undervolt.
> That's even the case for my XTXH card.
> My 6900XTU runs 1200mV by default; 1175mV is the voltage I have been using for 2750MHz.
> 
> ...


I guess I have to be OK with sustained 2600MHz+ frequencies along with a full memory OC. I suspect I could have gotten 60-70MHz more on water, but for the price (_got it at MSRP_) it's more than enough performance. If RT performance were around 20-30% better (_I had a 2080 Ti before this_), this would have been the perfect card for me. I just hope those clocks are stable and I don't have to lower them any further.

BTW, you got very lucky getting a refund for a tinkered-with RX 6800 XT.



Wrathier said:


> I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt
> 
> Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.
> 
> Anyone want to share some experiences?


Just be sure to set your expectations reasonably (_it's not the best of the best of the best card_) and it should provide you more than enough performance to enjoy the latest titles with lowered RT settings.


----------



## Godhand007 (May 30, 2021)

So, any thoughts on the new Unreal Engine 5 demo? I ran it on my system and was not impressed. I had to delete it, as it and the rest of the Unreal Engine 5 stuff were taking up around 200GB of prime SSD space.


----------



## Wrathier (May 30, 2021)

Godhand007 said:


> Just be sure to set your expectations reasonably (_it's not the best of the best of the best card_) and it should provide you more than enough performance to enjoy the latest titles with lowered RT settings.



I actually think it was the best of the best of the best: 3090 → 6900XT → 6800XT. When the RTX 3080 Ti hits the market, Nvidia might be relevant again.


----------



## Godhand007 (May 30, 2021)

Wrathier said:


> I actually think it was the best of the best of the best: 3090 → 6900XT → 6800XT. When the RTX 3080 Ti hits the market, Nvidia might be relevant again.


The 6900XT is the best card for 1440p raster; the 3090 is the best overall. 

Both companies are relevant these days only because of the GPU supply shortage. Prices for 3080 Ti models are going to exceed the reference price of a 3090. Unless I am missing something, custom 3080 Ti models will be beaten by custom 6900XT models except in RT benchmarks, so it's going to be a dud from a normal consumer's perspective.


----------



## Kabouter Plop (May 31, 2021)

Haven't really tried pushing my 6900 XT yet; it boosts to 2.5GHz out of the box, 2.555GHz with the Rage profile. I'm still at 1080p until late June, when I get my second shot and enough money to buy a Samsung Odyssey G9.


----------



## Godhand007 (May 31, 2021)

Kabouter Plop said:


> Haven't really tried pushing my 6900 XT yet; it boosts to 2.5GHz out of the box, 2.555GHz with the Rage profile. I'm still at 1080p until late June, when I get my second shot and enough money to buy a Samsung Odyssey G9.


Which custom model do you have, and do those clocks hold in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.


----------



## Felix123BU (May 31, 2021)

Godhand007 said:


> Which custom model do you have, and do those clocks hold in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.


Not really. I play at 3440x1440, which in pixel count sits between 1440p and 4K, and I hold exactly the same clocks regardless of resolution on my 6800XT. Granted, it's on water, but it could hold almost the same clocks with the stock reference cooler, just at higher temps.


----------



## Wrathier (May 31, 2021)

As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for the 3090, I must say the 6800 XT is the better card, to be honest. Ray tracing and the like need time to mature, I believe, and I have already completed all the important titles like Cyberpunk 2077. Now I mostly play AC Valhalla and games like Forza Horizon 4 and Watch Dogs Legion, and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.

I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and a lot of reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.
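For what it's worth, the "~30% for free" figure is plausible if fps scales inversely with shaded pixel count; real games scale a bit less than linearly, so treat this as an upper bound (the 16:9 resolutions below are just for illustration of the 1800p-vs-4K comparison):

```python
# Approximate fps multiplier from rendering fewer pixels, assuming fps is
# inversely proportional to shaded pixel count (an optimistic upper bound).

def fps_multiplier(native_px, scaled_px):
    return native_px / scaled_px

native = 3840 * 2160   # 4K
scaled = 3200 * 1800   # "1800p" at the same 16:9 aspect ratio
print(f"x{fps_multiplier(native, scaled):.2f} upper bound")  # prints x1.44
```

So ~44% at best, which makes a real-world ~30% gain believable once geometry and CPU costs that don't shrink with resolution are factored in.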

The only real thing I can complain about is the 40+ MH/s lost on ETH mining. That is a bit annoying.


----------



## Kabouter Plop (May 31, 2021)

Godhand007 said:


> Which custom model do you have, and do those clocks hold in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.



It's just the AMD reference card, bought directly from amd.com.


----------



## Felix123BU (May 31, 2021)

Wrathier said:


> As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for the 3090, I must say the 6800 XT is the better card, to be honest. Ray tracing and the like need time to mature, I believe, and I have already completed all the important titles like Cyberpunk 2077. Now I mostly play AC Valhalla and games like Forza Horizon 4 and Watch Dogs Legion, and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.
> 
> I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and a lot of reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.
> 
> The only real thing I can complain about is the 40+ MH/s lost on ETH mining. That is a bit annoying.


Yeah, the 6800XT is not as good for mining, but speaking from my own experience I get 62 MH/s at 130W; it's quite power-efficient, even though it's not as much of a hash monster as Ampere.
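Putting Wrathier's "40+ MH/s lost" together with the 62 MH/s @ 130W figure gives a rough perf-per-watt picture. The 3090's tuned mining wattage below (~290W) is an assumed ballpark for illustration, not a number from this thread:

```python
# Hash-per-watt comparison; the 3090 wattage is an assumed ballpark figure.

def mh_per_watt(mh_s, watts):
    return mh_s / watts

rx6800xt = mh_per_watt(62, 130)        # ~0.48 MH/s per W (figure quoted above)
rtx3090 = mh_per_watt(62 + 40, 290)    # ~0.35 MH/s per W (assumed wattage)
print(f"6800XT: ~{rx6800xt / rtx3090:.2f}x the efficiency per watt")
```

So even with a big absolute hashrate deficit, the 6800XT can come out ahead on efficiency, which matters more than raw MH/s once power costs are counted.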


----------



## Godhand007 (May 31, 2021)

Felix123BU said:


> Not really. I play at 3440x1440, which in pixel count sits between 1440p and 4K, and I hold exactly the same clocks regardless of resolution on my 6800XT. Granted, it's on water, but it could hold almost the same clocks with the stock reference cooler, just at higher temps.


Have to disagree on that. As resolution increases, clocks go down for me; not by a huge margin, but certainly not insignificantly. Temps remain mostly the same. I also believe this is the case for almost all modern cards, i.e. clocks go down at higher resolutions. That waterblock might be what makes the difference in your case, though.



Wrathier said:


> As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for the 3090, I must say the 6800 XT is the better card, to be honest. Ray tracing and the like need time to mature, I believe, and I have already completed all the important titles like Cyberpunk 2077. Now I mostly play AC Valhalla and games like Forza Horizon 4 and Watch Dogs Legion, and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.
> 
> I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and a lot of reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.
> 
> The only real thing I can complain about is the 40+ MH/s lost on ETH mining. That is a bit annoying.


Well, if you are happy with your card, then one of the most important boxes is already checked. I must say one can always justify a purchase based on the games one plays, but overall the 3090 is the better card (and twice as costly as well). I bought an RX 6900 XT because I wanted the best performance at 1440p; Nvidia was not a clear winner there, and with the OC potential of the RX 6900 XT, I knew I made the right decision, or at least the most optimal one. The lack of RT performance does hurt a bit, though. 

BTW, is _Watch Dogs Legion_ really performing better on AMD hardware? I presume you are talking about performance with RT off.


----------



## Wrathier (May 31, 2021)

Godhand007 said:


> Have to disagree on that. As resolution increases, clocks go down for me; not by a huge margin, but certainly not insignificantly. Temps remain mostly the same. I also believe this is the case for almost all modern cards, i.e. clocks go down at higher resolutions. That waterblock might be what makes the difference in your case, though.
> 
> 
> Well, if you are happy with your card, then one of the most important boxes is already checked. I must say one can always justify a purchase based on the games one plays, but overall
> ...



Happy and happy... you know, those are big words. I wasn't 100 percent happy with the performance of the 3090 even overclocked, and I think image scaling with Trixx (https://www.sapphiretech.com/en/software) plus AMD Image Sharpening will make up for the FPS I lose in some games. I don't know exactly how it performs at the same settings as the 3090; I will receive the card tomorrow. But I bought the card because I have some expectations of AMD's technology advantages in certain situations. Also, I really didn't need 24GB of VRAM yet. I think 16GB makes sense, as many games I played on the 3090 buffered more than 10GB, and others like AC Valhalla sit around 9.7GB, very close to the 10GB "limit" of a 3080. Also, scalper prices for Nvidia cards are ridiculous, and getting a card at MSRP these days is simply luck.


----------



## Godhand007 (May 31, 2021)

Wrathier said:


> Happy and happy... you know, those are big words. I wasn't 100 percent happy with the performance of the 3090 even overclocked, and I think image scaling with Trixx https://www.sapphiretech.com/en/software and AMD's image sharpening will make up for the FPS I lose in some games. I don't know exactly how it performs at the same settings as the 3090; I will receive the card tomorrow. But I bought the card because I have some expectations of AMD's technology advantages in some situations. And also, I really didn't need 24GB of VRAM yet. I think 16GB makes sense, as some games I played on the 3090 stored more than 10GB in the buffer. Other games like AC Valhalla sit around 9700ish, so it's very close to the 10GB "limit" of a 3080. Also, the scalper prices for NVIDIA cards are ridiculous, and getting a card at MSRP these days is simply luck.


What were you unhappy with (OC 3090)? Can you give some specific examples? 

Sharpening tech is available on Nvidia as well. Also, 24GB of VRAM is not why _gamers_ buy the RTX 3090; it's for the best 4K performance.

" I have some expectations to AMD's technology advantages in some situations": That is one of the reasons I had for choosing AMD as well. When the performance between two companies is so close, even minor advantages can become meaningful. Take the example of SAM vs Re-Sizable bar, AMD benefits more from SAM than Nvidia from the Re-Sizable bar.


----------



## Wrathier (May 31, 2021)

Godhand007 said:


> What were you unhappy with (OC 3090)? Can you give some specific examples?
> 
> Sharpening tech is available on Nvidia as well. Also, 24GB of VRAM is not why _gamers_ buy the RTX 3090; it's for the best 4K performance.
> 
> " I have some expectations to AMD's technology advantages in some situations": That is one of the reasons I had for choosing AMD as well. When the performance between the two companies is this close, even minor advantages become meaningful. Take the example of SAM vs Resizable BAR: AMD benefits more from SAM than Nvidia does from Resizable BAR.


Yeah, that is what I was not happy with. In AC Valhalla, the FPS before Resizable BAR could dip below 60 for a second or two, fucking up G-Sync on the Odyssey G9. Since it also has FreeSync Premium Pro, I hope it will be a good experience.


----------



## Felix123BU (Jun 1, 2021)

Just watched AMD's keynote. FSR looks quite good; let's see how and when it gets implemented in actual games. I hope Cyberpunk will be among the first ones.


----------



## Space Lynx (Jun 1, 2021)

Felix123BU said:


> Just watched AMD's keynote. FSR looks quite good; let's see how and when it gets implemented in actual games. I hope Cyberpunk will be among the first ones.



I feel bad for review sites; there are like 5 different FSR settings to choose from... so reviewing game performance with FSR is going to be more time-consuming than DLSS, which only has two settings lol
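For what it's worth, each FSR mode maps to a fixed per-axis scale factor, so the render resolutions a reviewer would have to test are easy to tabulate. A quick sketch at 4K, assuming the FSR 1.0 factors AMD published (1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance):

```python
# Render resolutions implied by FSR's quality modes at a 4K display.
# The per-axis scale factors are the published FSR 1.0 figures; treat
# the exact values as an assumption here.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(display_w, display_h, mode):
    """Internal resolution FSR upscales from, for a given display size."""
    factor = FSR_MODES[mode]
    return round(display_w / factor), round(display_h / factor)

for mode in FSR_MODES:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:>13}: {w}x{h}")
# Performance at 4K renders internally at 1920x1080, Quality at 2560x1440.
```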







@Mussels this is for you to enjoy whenever you like, since you commented about it LOL

LOOK AT THE GLORY OF AMD!!!!


----------



## Kabouter Plop (Jun 1, 2021)

Timing is perfect cos I'm gonna buy a Samsung Odyssey G9 super ultrawide (5120x1440, 240Hz) screen by the end of June.


----------



## Space Lynx (Jun 1, 2021)

Kabouter Plop said:


> Timing is perfect cos I'm gonna buy a Samsung Odyssey G9 super ultrawide (5120x1440, 240Hz) screen by the end of June.



If I were spending that kind of money, I'd just get the 48" OLED and sit further back. Any reason why you want the G9 over an OLED? Just curious.


----------



## GamerGuy (Jun 1, 2021)

I game on a first gen Super Ultra Wide, a Samsung 32:9 49" 3840x1080 144Hz Freesync 2 monitor (basically 2x 27" 144Hz 1080P monitors stuck together) and I love it. That extra width in games makes it so much more immersive for me, I'd moved from 16:9 to 21:9 years ago, and I don't think I can go back to just 16:9 now.....


----------



## Kabouter Plop (Jun 1, 2021)

lynx29 said:


> if I were spending that kind of money i'd just get the 48" OLED and sit further back.  any reason why you want the g9 over an OLED?  just curious



Because I can always game at 3440x1440 or 2560x1440 with perfect scaling and the benefit of 240Hz, rather than a lower refresh rate in games that do not support super ultrawide or ultrawide.
It's the only 240Hz ultrawide on the market atm.


----------



## Felix123BU (Jun 1, 2021)

lynx29 said:


> I feel bad for review sites, there are like 5 different FSR settings to choose from... so reviewing game performance with FSR is going to be more time consuming than DLSS which only has two settings lol
> 
> 
> 
> ...


I tried to do the math for a Hardware Unboxed 32 game roundup with each FSR setting ....... 

And AMD be like: OK, Nvidia's DLSS has 2-3 settings, we shall have 4-5, moar is better


----------



## GamerGuy (Jun 1, 2021)

Felix123BU said:


> I tried to do the math for a Hardware Unboxed 32 game roundup with each FSR setting .......


Yeah, Hardware Unboxed's Steve's gonna be quite the busy bee!


----------



## bulletoftime (Jun 3, 2021)

Hi fellow RX 6000 series owners... I have a slightly unusual question: did you ever run the 3DMark DirectX Raytracing feature test?

I have an Asus TUF RX 6900 XT OC. With the 21.4.1 drivers, I see a very low score in the 3DMark DirectX Raytracing feature test: *27.41 FPS*. I also see something abnormal in the monitoring logs: the GPU memory clock frequency drops quite a lot and is very unstable. I only see this kind of behavior in the DirectX Raytracing feature test.

The issue is fixed if I roll back to 21.2.3. With the 21.2.3 drivers I see *32.18 FPS*, and the memory clock only fluctuates slightly.

*All drivers after 21.2.3 seem to have the problem!*

Have any of you seen this issue?

My PC specs:
GPU: Asus TUF RX 6900 XT OC (Power Slider: +15%, Max Frequency: 2600 MHz)
CPU: AMD Ryzen 7 3700X (PBO ON)
Motherboard: Asus Strix B550-F Gaming (WiFi)
BIOS Version: 1804
RAM: 32GB G.Skill Trident Z Neo 3800MHz CL16 (2x16GB)
PSU: Seasonic FOCUS GX-1000 1000W 80+ Gold fully modular
Operating System & Version: WINDOWS 10 PRO 19043
GPU Drivers: Adrenalin Driver Version: 21.4.1
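For context, the gap between those two runs works out like this (just arithmetic on the scores quoted above):

```python
# Percentage regression between the two DirectX Raytracing feature test
# runs quoted above (21.2.3 vs 21.4.1 drivers).
fps_2123 = 32.18  # FPS on 21.2.3
fps_2141 = 27.41  # FPS on 21.4.1

regression = (fps_2123 - fps_2141) / fps_2123 * 100
print(f"~{regression:.1f}% slower on 21.4.1")  # roughly a 15% drop
```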


----------



## Kabouter Plop (Jun 3, 2021)

I suspect my memory is no longer stable with the 6900 XT. Before, I ran the SoC voltage at 1.175 with no load-line calibration, and it passed 1000% twice easily. But now I got an Unexpected Kernel Mode Trap BSOD caused by AMDRSServ.exe, which I think is also responsible for recording (and can be set to record directly into memory). At the time I was playing Dyson Sphere Program and also messing with hardware-monitoring related things, although I am starting to suspect the memory.
If I run the SoC voltage at stock defaults with no changes, I sometimes error instantly within 5 minutes, sometimes it takes hours; before, I raised it to 1.175 and got a 1000% pass twice.


----------



## las (Jun 3, 2021)

Felix123BU said:


> Just watched AMD's keynote. FSR looks quite good; let's see how and when it gets implemented in actual games. I hope Cyberpunk will be among the first ones.



Can you explain to me what looked good? I was disappointed by the massive blur... Most videos I have seen on YouTube come to the conclusion that it's worse than DLSS 1.0 and nothing but a spatial upscaler.

When you are able to spot blur in a heavily compressed video, it will be even worse when you actually use it.

The DLSS 1.0 marketing video looked great, yet it sucked, so my expectations for FSR have been lowered greatly.









No, AMD's FSR (FidelityFX Super Resolution) Is Not A DLSS Alternative, And Here Is Why You Should Care

AMD recently rolled out an amazing new framework called FidelityFX Super Resolution (FSR) and in a stroke of genius and commitment to open standards, made it available for both Radeon and GeForce GPUs (editor's note: see the update at the end of the article). While the feature will certainly...

wccftech.com


----------



## ratirt (Jun 3, 2021)

las said:


> Can you explain to me what looked good? I was disappointed by the massive blur... Most videos I have seen on YouTube come to the conclusion that it's worse than DLSS 1.0 and nothing but a spatial upscaler.
> 
> When you are able to spot blur in a heavily compressed video, it will be even worse when you actually use it.
> 
> ...


The 1060 screenshot looks bad, no doubt about it, but the video of the 6000 series presenting Godfall with the FSR modes looks good to me. 
Too bad this video has been omitted from WCCFtech's article.


----------



## las (Jun 3, 2021)

ratirt said:


> The 1060 screenshot looks bad, no doubt about it, but the video of the 6000 series presenting Godfall with the FSR modes looks good to me.
> Too bad this video has been omitted from WCCFtech's article.


Yeah, on the STILLS, where they just showed the percentages. I bet both sides are running at native there.

If you skip to the actual gameplay video, you can clearly see smearing and blur on the right side, which is FSR enabled. It looks like motion blur is applied, and that is the huge problem with spatial upscaling (which is nothing new for PC gamers).

I will be highly surprised if FSR is even remotely close to DLSS 2.0 in terms of performance and visuals; I expect DLSS 1.0 results or worse, tbh.

Watch this, FSR has issues


----------



## nguyen (Jun 3, 2021)

las said:


> Yeah, on the STILLS, where they just showed the percentages. I bet both sides are running at native there.
> 
> If you skip to the actual gameplay video, you can clearly see smearing and blur on the right side, which is FSR enabled. It looks like motion blur is applied, and that is the huge problem with spatial upscaling (which is nothing new for PC gamers).
> 
> ...











Doesn't look good with the RX 6800 XT at 4K either


----------



## Godhand007 (Jun 3, 2021)

nguyen said:


> View attachment 202654
> 
> View attachment 202655
> 
> Doesn't look good with RX6800XT at 4k either


Now compare them with DLSS 1.0.


----------



## GamerGuy (Jun 3, 2021)

I'd rather wait for the actual release of FSR and have well-regarded tech sites like TPU, Guru3D, LTT, HUB, etc. review and opine on FSR, not some YT presentation by someone I don't recognize at all. Analyzing PQ based on some video (which is bound to be compressed) is an instant fail for me; when I watched that vid, I couldn't take seriously anyone who forms an opinion based simply on that. Do I expect PQ/IQ to match DLSS 2.0? No, of course not, but I'm just hoping it'll be 'good enough', hopefully better than DLSS 1.0 even if below 2.0. We'll just have to wait for FSR to be released to see what trade-offs would have to be made PQ-wise.


----------



## nguyen (Jun 3, 2021)

Godhand007 said:


> Now compare them with DLSS 1.0.



DLSS 1.0 was actually not that bad at 4k








Monster Hunter World DLSS Review - Performance and Quality Analysed | 4K Analysis and benchmarks | Software

4K Analysis and benchmarks

www.overclock3d.net




Anyway, comparing to DLSS 1.0 is pretty moot; DLSS 2.0 came out 16 months ago. If anyone thinks FSR is worth the performance uplift/image quality trade-off when they were complaining about DLSS 2.0, then they are just being hypocritical.


----------



## Godhand007 (Jun 3, 2021)

nguyen said:


> DLSS 1.0 was actually not that bad at 4k
> 
> 
> 
> ...


I am not complaining about DLSS. It is good, especially for mid-range and entry-level gamers, but FSR would be a one-click FPS boost for those hordes of Fortnite (and similar games) players on laptops who don't necessarily do a pixel count on each frame.
Also, DLSS (Quality) at 1440p (I had a 2080 Ti before) was noticeably worse than native for me.


----------



## turbogear (Jun 3, 2021)

Felix123BU said:


> I will only judge FSR when I see it with my own eyes, at this moment nobody can actually judge how good or bad FSR is because nobody actually has seen it with their own eyes. (besides internal AMD employees and some developers)(and no, seeing a video in a video is not equal to seeing it with your own eyes )
> 
> Anybody who makes judgements now without actually having seen FSR on their own screen is probably biased either negatively or positively, not to say a "fanboy" (either of AMD or of Nvidia).
> 
> Needless to say, fanboy opinions of either AMD or Nvidia have 0 relevance either way


I agree.
I will wait to see it myself before I judge it.

As you said, it is always funny how fanboys start trolling every time.  

I agree AMD is late to the party, but let's see how FSR develops in the future. I also don't think AMD can challenge DLSS 2.0 at the moment, but let's see.
This is also the first AMD generation to support ray tracing at all. Let's see how it develops in the next generations of cards.

I mean, nobody, including me, expected anything from RDNA2, as many thought there was no way AMD would ever catch up with Nvidia, but here we are with the 6900XT trading blows with the 3090 in non-RT tests. 

Many then said Nvidia would bring the 3080Ti to blow AMD away, but according to the review from TPU, it is not much faster than the 6900XT.
It is a pity that W1zzard did not include his 6900XT XTXH card for comparison with the 3080Ti.









ASRock Radeon RX 6900 XT OC Formula Review - This Card is Fast

The ASRock Radeon RX 6900 XT OC Formula is built using AMD's new Navi 21 XTXH chip, which runs much higher clocks than what the regular RX 6900 XT can achieve. In our testing, this is the first AMD card in a long time to beat NVIDIA's current-generation flagship, the RTX 3090.

www.techpowerup.com












NVIDIA GeForce RTX 3080 Ti Founders Edition Review

NVIDIA's GeForce RTX 3080 Ti is the green team's answer to AMD's recent Radeon launches. In our testing, this 12 GB card basically matches the much more expensive RTX 3090 24 GB in performance. The compact dual-slot Founders Edition design looks gorgeous and is of amazing build quality.

www.techpowerup.com




It would be really interesting, though, if the 3080Ti sells for $1,200. If that happens, then at least gamers have one option for a card that is not crazily overpriced at the moment. 

The 6900XT is selling for almost 2000€ and the 3090 for nearly 3000€.


----------



## Butanding1987 (Jun 4, 2021)

All they’re saying is wait for the real thing, then judge for yourself. Everything is hearsay for now.


----------



## turbogear (Jun 4, 2021)

las said:


> Yeah, but I do, in rasterization my 3080 beats 6800XT and sometimes even 6900XT, for 700 dollars
> 
> When I enable DLSS 2.0 in supported games, it just leaves them in the dust instead - Or use RT
> 
> ...


Please stop trolling in the RDNA2 owners' thread and trying to turn it into an NVIDIA vs AMD flame war. 

We really don't care about DLSS 2.0, and I never tried it or compared it to DLSS 1.0.
I simply don't care about it.

I like to game at 2K with high frame rates, and for that my 6900XTU is a great card, getting 100-110 FPS in Cyberpunk on Ultra settings.


----------



## the54thvoid (Jun 4, 2021)

Official warning. This is a thread for 6000 series owners. Not to argue green versus red.

Infractions are waiting for those that are determined to pee in somebody else's pool.

Stay on topic. Mass of deletions made.

Reply ban given.


----------



## ratirt (Jun 4, 2021)

turbogear said:


> Please stop trolling on RDNA2 owners thread and trying to bring this thread into NVIDIA vs AMD flame war.
> 
> We really don't care about DLSS 2.0 and I never tried or compared it to DLSS 1.0.
> I just simply don't care about it.
> ...


I go 4K in CP2077 with FidelityFX and it's OK. Sometimes I drop one detail setting to make it run faster  The idea of going 1440p for whatever reason and giving up 4K is not for me. I'm simply too much of a 4K admirer to drop to 1440p if I really don't have to.


----------



## turbogear (Jun 4, 2021)

ratirt said:


> I go 4K in CP2077 with FidelityFX and it's OK. Sometimes I drop one detail setting to make it run faster  The idea of going 1440p for whatever reason and giving up 4K is not for me. I'm simply too much of a 4K admirer to drop to 1440p if I really don't have to.


Well, I have not gone 4K until now, since at 4K a 27-inch monitor, which is the ideal fit for my desk, is no longer best for my older eyes, especially for desktop work. 
I need to update my eyeglasses anyway.


----------



## ratirt (Jun 4, 2021)

turbogear said:


> Well, I have not gone 4K until now, since at 4K a 27-inch monitor, which is the ideal fit for my desk, is no longer best for my older eyes, especially for desktop work.
> I need to update my eyeglasses anyway.


Are you serious? I got a 27-inch 4K 144Hz and I couldn't be happier with it  You know, if you can't see well enough, just bump the font size  
I ditched my glasses a year ago and I haven't even noticed, and I'm not a young one either  My LG GN950 is just awesome


----------



## Felix123BU (Jun 4, 2021)

ratirt said:


> I go 4K in CP2077 with FidelityFX and it's OK. Sometimes I drop one detail setting to make it run faster  The idea of going 1440p for whatever reason and giving up 4K is not for me. I'm simply too much of a 4K admirer to drop to 1440p if I really don't have to.


I was thinking about 4K, but then I got a UW monitor, and I don't want to go back to non-UW 
Cyberpunk is the only game where I want to see FSR on my 6800XT; it's the only game that drops to 30 FPS at my resolution with RT on, and I mean reflections only. RT shadows and global illumination are not something I am impressed by in Cyberpunk, but RT reflections do look better.


----------



## ratirt (Jun 4, 2021)

Felix123BU said:


> I was thinking about 4K, but then I got a UW monitor, and I don't want to go back to non-UW
> Cyberpunk is the only game where I want to see FSR on my 6800XT; it's the only game that drops to 30 FPS at my resolution with RT on, and I mean reflections only. RT shadows and global illumination are not something I am impressed by in Cyberpunk, but RT reflections do look better.


Have you tried FidelityFX and maybe dropping details a bit?
Currently I'm playing Godfall and Horizon Zero Dawn. Well, and Euro Truck Simulator 2  I don't need to do anything with these games to get higher FPS though. 
FSR with CP2077 ought to be interesting; I would like to see what FPS I'd get with it. FidelityFX works well with this game, so you may as well try it. I'd suggest not going below 70-75% scaling.
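To put some numbers on why 70-75% is a sensible floor: the scale slider applies per axis, so the actual pixel count falls with its square. A quick sketch at 3440x1440 (plain arithmetic, assuming a per-axis slider):

```python
# Effective render resolution for a resolution-scale slider at 3440x1440.
# The slider scales each axis, so pixel count drops with the square of it.
def scaled_resolution(w, h, percent):
    s = percent / 100
    return round(w * s), round(h * s), s * s  # width, height, pixel fraction

for pct in (100, 85, 75, 70, 50):
    sw, sh, frac = scaled_resolution(3440, 1440, pct)
    print(f"{pct:>3}%: {sw}x{sh}  ({frac:.0%} of native pixels)")
# At 75% you are already rendering only ~56% of the native pixels;
# at 50% it is just 25%, which is why the image gets crappy fast.
```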


----------



## turbogear (Jun 4, 2021)

ratirt said:


> Are you serious? I got a 27-inch 4K 144Hz and I couldn't be happier with it  You know, if you can't see well enough, just bump the font size
> I ditched my glasses a year ago and I haven't even noticed, and I'm not a young one either  My LG GN950 is just awesome


I need to test it. 
In the past I did not have a strong enough GPU for 4K, but since I got the 6900XT I have thought about it; the electronics stores were closed due to Covid, though, so I could not go and check some 4K monitors in real life.
My eyes are a little sensitive.
A few years ago they were really good, but now at age 45 the focus distance for reading small text is getting larger, and I need working glasses for monitors.   

I am not always happy with increasing DPI. I have done that on my work laptop, but some of the tools I use at work do not look that nice at higher DPI.
The browser, for example, is no issue.


----------



## ratirt (Jun 4, 2021)

turbogear said:


> I need to test it.
> In the past I did not have a strong enough GPU for 4K, but since I got the 6900XT I have thought about it; the electronics stores were closed due to Covid, though, so I could not go and check some 4K monitors in real life.
> My eyes are a little sensitive.
> I am not always happy with increasing DPI. I have done that on my work laptop, but some of the tools I use at work do not look that nice at higher DPI.
> The browser, for example, is no issue.


I can't imagine going back to 1440p or lower. For me, 4K is a game changer, and everything I've purchased recently is due to that fact: new CPU, new graphics, new screen. All for 4K gaming, and so far I couldn't be more pleased with my choice  My 6900XT works perfectly fine, and the 5800X I purchased last week kicks the 6900XT in some games to a whole new level


----------



## sepheronx (Jun 4, 2021)

ratirt said:


> I can't imagine going back to 1440p or lower. For me, 4K is a game changer, and everything I've purchased recently is due to that fact: new CPU, new graphics, new screen. All for 4K gaming, and so far I couldn't be more pleased with my choice  My 6900XT works perfectly fine, and the 5800X I purchased last week kicks the 6900XT in some games to a whole new level


I spent all my money on the recent purchases, so I can't afford a 4K monitor.  Any suggestions I should keep an eye on?  My 6800XT is not even stressed at all at 1080p.


----------



## ratirt (Jun 4, 2021)

sepheronx said:


> I spent all my money on the recent purchases. So I can't afford a 4k monitor.  Any suggestions I should keep an eye on?  My 6800xt is not even stressed at all at 1080p.


If you aim for 4K, then the best solution as of today would be the LG GN950. There is a new refresh coming, and there will be more 4K 144Hz monitors with low latency, but believe it or not, buying this monitor was a struggle. When it hit the shelves, I was hunting for it. I struck out in three stores and had to cancel the orders. At some point I had the LG GN950 ordered in four stores, and only one confirmed and sent it. Not sure how it looks now, to be honest. 4K is the way to go for me, and with your 6800 XT you can go no wrong with 4K. Unless you wanna go for 1440p, or maybe something like ultrawide?


----------



## zer0signal (Jun 4, 2021)

Hey everyone, I have been reading and lurking in this thread for a few weeks now.  I wanted to know if I have reached the max I can with this card.

I purchased a Liquid Devil 6900XT and have it paired with a Ryzen 9 5900X and a 1000W PSU (Asus Strix "Seasonic").

In games (DX12, Vulkan, DX11) I can keep my sliders at min 2600, max 2700, +15% (max mem, w/ fast timing), 58c/78c, and basically achieve an effective clock of about 2650MHz (any higher and I crash the game).

I cannot achieve this in Time Spy or Time Spy Extreme; 2625 (max slider) is all I can do reliably without a driver crash to desktop (or black screen).
In Port Royal: 2650 (max).

I have tried MPT (375/375) +15% (~405W at peak; without MPT ~345W peak), but it seems no matter how much extra power I give, I cannot get past a 2625 max slider for Time Spy.


I pulled the water block off and found 2 of the screws holding the block to the GPU were barely tight (I noticed hotspots around 110 in AIDA64). 
My 3-year-old son could probably have removed them with his fingers...  

So I completely pulled the block, repasted with Kryonaut, and re-set the block with much more tension. Hotspots are now about 88c in AIDA64 during the GPU stress test.

I feel like this is the max this card can do in Time Spy without a voltage increase. 
Would you think the same, or is there something else I'm missing?

Please don't take this as complaining (I really love this GPU, coming from a 2080 Super); the rasterization of this card is phenomenal!
I can game at 4K above 110FPS in most games (no RT).

I am just trying to see what the maximum is that I can push this to.


----------



## sepheronx (Jun 4, 2021)

ratirt said:


> If you aim for 4K, then the best solution as of today would be the LG GN950. There is a new refresh coming, and there will be more 4K 144Hz monitors with low latency, but believe it or not, buying this monitor was a struggle. When it hit the shelves, I was hunting for it. I struck out in three stores and had to cancel the orders. At some point I had the LG GN950 ordered in four stores, and only one confirmed and sent it. Not sure how it looks now, to be honest. 4K is the way to go for me, and with your 6800 XT you can go no wrong with 4K. Unless you wanna go for 1440p, or maybe something like ultrawide?


1440p ultrawide is fine too.


----------



## Felix123BU (Jun 4, 2021)

ratirt said:


> have you tried FidelityFx and maybe drop details a bit?
> Currently I'm playing GodFall and Horizon Zero Dawn. Well and EU truck simulator 2  I don't need to do anything with these games to run with higher FPS though.
> FSR with CP2077 outta be interesting. I would like to see what FPS I would get with it. FidelityFX works good with this you may as well try it. Suggesting not to go below 70%-75%.


In Cyberpunk without RT I get around 70-80 FPS with all settings maxed out (basically all Ultra) at 3440x1440. I tried FidelityFX with 50 to 75% resolution scaling with RT reflections, but it looks quite crappy compared to native resolution. The game looks fantastic even without RT. That's why I am curious whether AMD and CD Projekt will implement FSR in Cyberpunk; if it looks at least 90% as good as the FSR presentation in Godfall, it would look tons better than FidelityFX with resolution scaling, and actually make it playable without losing a ton of detail  
But then again, that FSR presentation is just that, a presentation; how actual FSR will look I have no idea.


As said, that's the only game I played that goes bonkers with RT. I have played Shadow of the Tomb Raider with RT High and Ultra (not that it looks better, just because I could ); RT High gets me around 80 FPS, Ultra around 60. Horizon Zero Dawn gets 100-110 FPS.

Another one that could probably use FSR in the future is The Witcher 3 remaster with RT; I have a feeling, since it's an Nvidia-sponsored game, that it won't run too great on AMD with RT. RT(hair)works again 



zer0signal said:


> Hey everyone, I have been reading and lurking in this thread for a few weeks now.  I wanted to know if I have reached the max I can with this card.
> 
> I purchased a Liquid Devil 6900XT and have it paired with a Ryzen 9 5900X and a 1000W PSU (Asus Strix "Seasonic").
> 
> ...


Not an expert on the 6900XT, but one thing I noticed that might be an issue for you: you say "max mem, w/ fast timing". It's quite common that maxing the mem clock will actually degrade performance, since the VRAM chips on most 6800XT and 6900XT cards start to produce errors over 2100MHz with fast timings and can even cause crashes (especially in 3DMark, happened to me) if the mem is pushed too hard. I would try setting the mem at max 2100MHz and retesting with GPU frequency changes.

Is the Liquid Devil 6900 an XT or an XTU? If it's an XT "only", 2.65GHz in-game is quite good; some here can attest to that 

The only thing you could really push more is replacing the Kryonaut with liquid metal. It does wonders for temps, but it is not for everybody, since it can be tricky and needs a leap of faith, so to say 



ratirt said:


> I can't imagine going back to 1440p or lower. For me, 4K is a game changer, and everything I've purchased recently is due to that fact: new CPU, new graphics, new screen. All for 4K gaming, and so far I couldn't be more pleased with my choice  My 6900XT works perfectly fine, and the 5800X I purchased last week kicks the 6900XT in some games to a whole new level


That's funny, I had the same feeling of my 5800X kicking my 6800XT to a whole new level  



sepheronx said:


> 1440p ultrawide is fine too.


6800XT and 2k Ultrawide are a match made in heaven


----------



## turbogear (Jun 4, 2021)

ratirt said:


> If you aim for 4K, then the best solution as of today would be the LG GN950. There is a new refresh coming, and there will be more 4K 144Hz monitors with low latency, but believe it or not, buying this monitor was a struggle. When it hit the shelves, I was hunting for it. I struck out in three stores and had to cancel the orders. At some point I had the LG GN950 ordered in four stores, and only one confirmed and sent it. Not sure how it looks now, to be honest. 4K is the way to go for me, and with your 6800 XT you can go no wrong with 4K. Unless you wanna go for 1440p, or maybe something like ultrawide?


Damn, the LG GN950 looks interesting but also quite expensive.   
Most known stores like Caseking.de are selling it in the range of 1000€.


----------



## zer0signal (Jun 4, 2021)

Felix123BU said:


> Not an expert on the 6900XT, but one thing I noticed that might be an issue for you: you say "max mem, w/ fast timing". It's quite common that maxing the mem clock will actually degrade performance, since the VRAM chips on most 6800XT and 6900XT cards start to produce errors over 2100MHz with fast timings and can even cause crashes (especially in 3DMark, happened to me) if the mem is pushed too hard. I would try setting the mem at max 2100MHz and retesting with GPU frequency changes.
> 
> Is the Liquid Devil 6900 an XT or an XTU? If it's an XT "only", 2.65GHz in-game is quite good; some here can attest to that
> 
> The only thing you could really push more is replacing the Kryonaut with liquid metal. It does wonders for temps, but it is not for everybody, since it can be tricky and needs a leap of faith, so to say



I don't think I tried a higher core OC with the mem lowered.  I did do incremental steps with the mem in Time Spy and noticed a gain on each tick up to the max. I'll give it a whirl with the mem lowered.

Yes, it's just an "XT". I am very satisfied with 2.65GHz in game as well; still looking to see if I can squeeze more out 

I'll hold off on the liquid metal for now; may revisit in the future!

Thanks for the info!


----------



## turbogear (Jun 4, 2021)

zer0signal said:


> Hey everyone, I have been reading and lurking in this thread for a few weeks now.  I wanted to know if I have reached the max I can with this card.
> 
> I purchased a Liquid Devil 6900XT and have it paired with a Ryzen 9 5900X and a 1000W PSU (Asus Strix "Seasonic").
> 
> ...


Welcome to the club.  
What you mentioned about OCing and Time Spy seems to fit the general pattern. Not many 6900XTs can be pushed much higher, except if you are lucky and got a golden chip.
2650MHz in games is decent.
I have the 6900XT Liquid Devil Ultimate, and it is stable at 2550MHz min and 2750MHz max @1175mV in Time Spy, which gives me around 2700MHz in games.
My 6900XTU, which is an XTXH chip, can run up to 2780MHz in games with 2850MHz @1175mV (max voltage possible 1200mV), but that setting is not stable in 3DMark, so I don't use it for daily gaming either.

I was also not very satisfied with the temperatures on the original paste, although mine did not reach 110°C but maxed at 92°C.
Now I am on liquid metal, and hotspot temperatures in games hover around 70°C.
In stress tests with a 400W core power setting, hotspot temperatures remain below 78°C.  

Check, though, whether your memory is stable at 2150MHz.
As @Felix123BU mentioned, many RDNA2 cards don't like much more than 2100MHz on memory.
I am not OCing memory at all at the moment, as performance is already quite good without a memory OC.


----------



## zer0signal (Jun 4, 2021)

turbogear said:


> Welcome to the club.
> What you mentioned about OCing and Time Spy seems to fit the general pattern. Not many 6900XTs can be pushed much higher, except if you are lucky and got a golden chip.
> 2650MHz in games is decent.
> I have the 6900XT Liquid Devil Ultimate, and it is stable at 2550MHz min and 2750MHz max @1175mV in Time Spy, which gives me around 2700MHz in games.
> ...


Thanks for the info. From what I was reading, I pretty much figured that what I have is maxed out     Still overall very happy with the card; I just couldn't believe how loose the 2 screws on the outside of the GPU hold-down were...  Poor QC, but I'm sure they're just trying to pump out as many of these as they can right now.  

I will probably attempt liquid metal in the future. But right now, I can't complain 




Felix123BU said:


> It's quite common that maxing the mem clock will actually degrade performance, since the VRAM chips on most 6800XT and 6900XT cards start to produce errors over 2100MHz with fast timings and can even cause crashes (especially in 3DMark, happened to me) if the mem is pushed too hard. I would try setting the mem at max 2100MHz and retesting with GPU frequency changes.


I did just try dropping the mem to defaults, and still no dice.  
Ah well, I'm pretty sure I am at the max of this chip unless someone finds a way to unlock the voltage to 1.2V.


----------



## Felix123BU (Jun 4, 2021)

zer0signal said:


> Thanks for the info. From what I was reading, I pretty much figured that what I have is maxed out.    Still very happy with the card overall, I just couldn't believe how loose the two screws on the outside of the GPU hold-down were.  Poor QC, but I'm sure they are just trying to pump as many of these out as they can right now.
> 
> I will probably attempt liquid metal in the future. But right now, I can't complain
> 
> ...


Oooo yes, I would very much love it if in future I could adjust the chip voltage, even a bit higher. It's a pity that AMD does not allow at least a little leeway. I suspect the chips are already pushed to the max, but maybe some still have some extra in them. As you said, I am not complaining (not a lot ), I am also super pleased with my card. I really did not expect the 6000 series to be so strong.


----------



## turbogear (Jun 4, 2021)

Felix123BU said:


> Oooo yes, I would very much love it if in future I could adjust the chip voltage, even a bit higher. It's a pity that AMD does not allow at least a little leeway. I suspect the chips are already pushed to the max, but maybe some still have some extra in them. As you said, I am not complaining (not a lot ), I am also super pleased with my card. I really did not expect the 6000 series to be so strong.


Higher voltage does not always help.  
Mine can be set to 1.2V, but higher voltage and power beyond 400W do not help stabilize the card above the 2750MHz setting in 3DMark benchmarks.


----------



## Felix123BU (Jun 4, 2021)

turbogear said:


> Higher voltage does not always help.
> Mine can be set to 1.2V but higher voltage and power more than 400W does not help to stabilize the card above 2750MHz setting in 3DMark benchmarks.


Yeah, that's why I said I guess most chips are already pushed to the max by default, but there might be exceptions   

Speaking of AMD's FSR: I am playing Xenoblade Chronicles X on the CEMU emulator (it's the weekend, I have some free time ). It's a Wii U game, and the Wii U is an old, weak console compared to modern-ish PCs. CEMU can use either bilinear, bicubic, Hermite or nearest-neighbor upscaling (so simple, traditional upscaling tech, since the Wii U supported 1080p max), and the upscaling looks muuuuuuch better than what AMD used in the initial FidelityFX resolution modifier.

I am a bit confused: why did they, or Nvidia for that matter, not use simple upscaling until now? The consoles had it, both PS4 and Xbox One, so why was this not used on PC??? (Actually I know why, it would have stopped a lot of people from upgrading )
The game looks very good on my UW monitor upscaled to 3440x1440, unexpectedly good considering it was designed for the Wii U with its very limited graphics capabilities. I have not had the "pleasure" of seeing DLSS 1.0 with my own eyes, but from the multitude of videos on YouTube, the CEMU upscaling is at least on par with DLSS 1.0, if not better, with less smearing... So if some guys can create such a solution in their own free time, why could AMD or Nvidia not do it until recently?

Anyway, it's late here and it's time for conspiracy theories


----------



## Godhand007 (Jun 5, 2021)

turbogear said:


> I agree.
> I will wait to see it myself before I judge it.
> 
> As you said it is always funny how fanboys start trolling around every time.
> ...


The 6900XT is faster than a 3080 Ti at all resolutions below ~4K. This is one of the things that makes it difficult to say which card is faster this generation. AMD is too close to be called the loser.



turbogear said:


> Please stop trolling on RDNA2 owners thread and trying to bring this thread into NVIDIA vs AMD flame war.
> 
> We really don't care about DLSS 2.0 and I never tried or compared it to DLSS 1.0.
> I just simply don't care about it.
> ...


100-110 FPS in Cyberpunk on Ultra settings; performance is usually a tier lower than that, around 70-90 FPS, with all settings maxed out except Psycho SSAO on my OCed 6900XT.



zer0signal said:


> Thanks for the info. From what I was reading, I pretty much figured that what I have is maxed out.    Still very happy with the card overall, I just couldn't believe how loose the two screws on the outside of the GPU hold-down were.  Poor QC, but I'm sure they are just trying to pump as many of these out as they can right now.
> 
> I will probably attempt liquid metal in the future. But right now, I can't complain
> 
> ...


Listening to you guys on water/XTXH chips, I think I lucked out with my reference 6900XT at 2665MHz (>2600MHz in-game) and full memory OC on air. These clocks are stable through hours of Time Spy and other 3DMark stress tests. I can play with 2700MHz in-game frequency without crashes, but I like to have rock-stable settings for my GPU.



Felix123BU said:


> Another one that could probably use FSR in the future is The Witcher 3 remaster with RT. I have a feeling, since it's an Nvidia-sponsored game, that it won't run too great on AMD with RT. RT(hair)works again


Yeaa, I agree. Don't expect The Witcher 3 to get any better. Here are some of the recent results for it (modded game). And that is without HairWorks on.


----------



## Garlic (Jun 5, 2021)

Did the new Radeon drivers do anything for stability? Before I updated the drivers I could only OC my 6700 XT to around 2720MHz; after the update I could do 2820MHz, and it never crashes now.


----------



## Godhand007 (Jun 5, 2021)

bulletoftime said:


> Hi fellow RX 6000 series owners....I have a slightly unusual question: did you ever run the 3DMark DirectX Raytracing feature test?
> 
> I have an Asus TUF RX 6900 XT OC. With 21.4.1 drivers, I see a very low score in the 3DMark DirectX Raytracing feature test: *27.41 FPS*. I see something abnormal in the monitoring logs: the GPU Memory Clock Frequency seems to drop down quite a lot and is very unstable. Only see this kind of behavior in DirectX Raytracing feature test.
> 
> ...


I got a score of 28.70 with the latest drivers. My mem clocks were completely stable though.


----------



## Kabouter Plop (Jun 5, 2021)

Can SOC voltage lead to PCIe instability? I have a feeling my SOC voltage is higher with the 6900 XT than it was with my old GPU. I lowered it quite a bit and Overwatch stopped crashing. I may try lowering it even more after I test the memory properly, to see if I can keep it stable at lower voltages. I'm not worried about temps, my rig is fully watercooled. Sorry if this is the wrong place to ask.


----------



## ratirt (Jun 5, 2021)

turbogear said:


> Damn, the LG GN950 looks interesting but is also quite expensive.
> Most known stores like Caseking.de are selling it in the range of 1000€.


I bought it for $1000 and it was cheap  Just checked prisjakt.no. No GN950 available :/
You have it listed, but the listed price doesn't match what you would actually buy it for in Norway 

BTW, my CPU rose to 70°C with Godfall. Damn, what a great game. My nephews tried it today. RT on and the FPS damn dropped to 56 :O WTF.
The further you go, the more demanding it gets. But I gotta say, the game is optimized like few I know. Have you played it? 
I got a 5800X, so... wanna battle over who gets the better bench score with your 6800XT? 


Godhand007 said:


> The 6900XT is faster than a 3080 Ti at all resolutions below ~4K. This is one of the things that makes it difficult to say which card is faster this generation. AMD is too close to be called the loser.
> 
> 
> 100-110 FPS in Cyberpunk on Ultra settings; performance is usually a tier lower than that, around 70-90 FPS, with all settings maxed out except Psycho SSAO on my OCed 6900XT.
> ...


Don't mention Nvidia here, it will bring all the trolls in to talk about it. Try to avoid it, please.  
Thank you


----------



## mama (Jun 5, 2021)

turbogear said:


> Damn, the LG GN950 looks interesting but is also quite expensive.
> Most known stores like Caseking.de are selling it in the range of 1000€.


Look at the LG CX. Works great for 4K 120Hz gaming. And doubles as a TV.


----------



## Space Lynx (Jun 5, 2021)

mama said:


> Look at the LG CX. Works great for 4K 120Hz gaming. And doubles as a TV.



Agreed.  I really hope in a couple of years we have LG 40" OLED 120Hz 4K models for around $899. Probably won't get that lucky... but I can dream.


----------



## turbogear (Jun 6, 2021)

ratirt said:


> I bought it for $1000 and it was cheap  Just checked prisjakt.no. No GN950 available :/
> You have it listed, but the listed price doesn't match what you would actually buy it for in Norway
> 
> BTW, my CPU rose to 70°C with Godfall. Damn, what a great game. My nephews tried it today. RT on and the FPS damn dropped to 56 :O WTF.
> ...


I don't own Godfall.
Currently I am playing Cyberpunk.
I own the 6900XTU Liquid Devil Ultimate card now; if you remember, my 6800XT got destroyed by ESD.
My 5800X is now going into my son's computer, and I own a 5900X.


----------



## ratirt (Jun 7, 2021)

turbogear said:


> I don't own Godfall.
> Currently I am playing Cyberpunk.
> I own the 6900XTU Liquid Devil Ultimate card now; if you remember, my 6800XT got destroyed by ESD.
> My 5800X is now going into my son's computer, and I own a 5900X.


Godfall is a cool game, and it utilizes the CPU and GPU well. 
I remember now that the 6800XT got ruined, but I didn't know you got the Ultimate version. 
How far have you been able to push the Ultimate version?


----------



## turbogear (Jun 7, 2021)

ratirt said:


> GodFall is a cool game and it does utilize CPU and GPU well.
> I remember now the 6800XT got ruined but I didn't know you got the Ultimate version.
> How far were you able to push the Ultimate version?


It is running at 2750MHz@1175mV with 415W core power and 400A TDC, so in games the frequency is around 2700MHz.
So far I have been unable to get it stable in 3DMark benchmarks any higher than that.
In Heaven I can run it at 2850MHz@1185mV for hours without a crash, but that setting is not really stable in heavy stress tests like Time Spy.
It could work in games, but I don't like to run a setting that is not 100% stable in stress tests.  
I did not try higher than 415W core power. Maybe that could make it stable at 2850MHz, but I don't like to push it to its power limits.


----------



## Butanding1987 (Jun 7, 2021)

415w on an AMD card.


----------



## ratirt (Jun 7, 2021)

turbogear said:


> It is running at 2750MHz@1175mV with 415W core power and 400A TDC, so in games the frequency is around 2700MHz.
> So far I have been unable to get it stable in 3DMark benchmarks any higher than that.
> In Heaven I can run it at 2850MHz@1185mV for hours without a crash, but that setting is not really stable in heavy stress tests like Time Spy.
> It could work in games, but I don't like to run a setting that is not 100% stable in stress tests.
> I did not try higher than 415W core power. Maybe that could make it stable at 2850MHz, but I don't like to push it to its power limits.


That's still a nice OC. I haven't tried to push my card to the limits, but I bet it won't hit a stable 2750MHz, or the chances are very slim.
Yesterday I did some gaming and tests and I saw the hotspot around 100°C. I'm thinking about getting stronger case fans and adding one more. That should exhaust the hot air faster and bring fresh air into the case. I'm sure it won't drastically change the temps, but it will improve them for sure.
BTW, how much did the card cost in Germany?
I'm looking at the prices in Norway and it doesn't look good. I would have to pay over $3k for the Ultimate card or the Toxic. That's about double what I paid for the Red Devil 6900XT I got.


----------



## Butanding1987 (Jun 7, 2021)

ratirt said:


> That's still a nice OC. I haven't tried to push my card to the limits, but I bet it won't hit a stable 2750MHz, or the chances are very slim.
> Yesterday I did some gaming and tests and I saw the hotspot around 100°C. I'm thinking about getting stronger case fans and adding one more. That should exhaust the hot air faster and bring fresh air into the case. I'm sure it won't drastically change the temps, but it will improve them for sure.
> BTW, how much did the card cost in Germany?
> I'm looking at the prices in Norway and it doesn't look good. I would have to pay over $3k for the Ultimate card or the Toxic. That's about double what I paid for the Red Devil 6900XT I got.


I'm sure you'll get very far once you put that underwater.


----------



## ratirt (Jun 7, 2021)

Butanding1987 said:


> I'm sure you'll get very far once you put that underwater.


I've been thinking about it, and maybe I will at some point. Although I wonder whether going liquid will make a noticeable difference in performance. It will be faster for sure, but is it worth it? My main concern, to be honest, is running the card cooler, and if I go liquid that will be my goal.


----------



## turbogear (Jun 7, 2021)

ratirt said:


> That's still a nice OC. I haven't tried to push my card to the limits, but I bet it won't hit a stable 2750MHz, or the chances are very slim.
> Yesterday I did some gaming and tests and I saw the hotspot around 100°C. I'm thinking about getting stronger case fans and adding one more. That should exhaust the hot air faster and bring fresh air into the case. I'm sure it won't drastically change the temps, but it will improve them for sure.
> BTW, how much did the card cost in Germany?
> I'm looking at the prices in Norway and it doesn't look good. I would have to pay over $3k for the Ultimate card or the Toxic. That's about double what I paid for the Red Devil 6900XT I got.


I had hotspot temperatures going up to 92°C on my 6900XTU Liquid Devil Ultimate before.
I changed from the standard paste to liquid metal, and now hotspot temperatures max out at 72°C in games and 78°C in stress tests at my 2750MHz@1175mV OC.  
At default settings the temperatures are actually much lower.

I paid 2199€ for it back in April.  
Effectively it cost me 1297€, as I got money back for my 6800XT. 
Taking that into account it was not a bad deal, as a regular 6900XT Red Devil currently starts at 1700€ in Germany.



Butanding1987 said:


> 415w on an AMD card.


Yes, but that is only consumed in stress tests. 
The headroom for this card is 525W with its 3x 8-pin PCIe power connectors. 

In games it actually consumes between 320W and 350W at the 2750MHz OC.


----------



## Butanding1987 (Jun 7, 2021)

ratirt said:


> I've been thinking about it, and maybe I will at some point. Although I wonder whether going liquid will make a noticeable difference in performance. It will be faster for sure, but is it worth it? My main concern, to be honest, is running the card cooler, and if I go liquid that will be my goal.


It will definitely be fun, regardless of whether you'd notice real world performance!  



turbogear said:


> The headroom for this card is 525W with 3xPCIe slots.


415w is unacceptable, then. You need to push it further!


----------



## turbogear (Jun 7, 2021)

Butanding1987 said:


> It will definitely be fun, regardless of whether you'd notice real world performance!


On the other hand, it is not easy to get a block for the 6900XT Red Devil.
@ratirt owns this card as far as I know.

Only Alphacool seems to offer one at the moment.
I know a few people on the Igor'sLAB forum who have been waiting for months for Alphacool to deliver their orders.
Alphacool keeps shifting the delivery date.  

People were saying they got info from EK support that EK will release a block for the 6900XT Red Devil in the near future.
Maybe that will arrive faster than the Alphacool pre-orders.


----------



## Kabouter Plop (Jun 7, 2021)

I bought my card from AMD knowing I would be certain to get a block, not to mention I would be paying a lot more from other vendors due to the web shops being scalpers themselves.


----------



## turbogear (Jun 7, 2021)

Butanding1987 said:


> 415w is unacceptable, then. You need to push it further!


You know, in the current situation where a new GPU would cost you a kidney, I don't dare to push it further. 



Kabouter Plop said:


> I bought my card from AMD knowing I would be certain to get a block, not to mention I would be paying a lot more from other vendors due to the web shops being scalpers themselves.


Getting a card from AMD is not easy.
I tried twice after my 6800XT was damaged.

You need reaction times faster than in shooter games, otherwise the cards are gone even before you can complete the order. 

Both times I was on their site at the right time, as I had been tipped off about the time window in which AMD would have cards on their site, and both times they were already sold out while I was trying to pay with PayPal.

The Digital River website was extremely slow during that time; it seems some bots are at work in the background. 
Many of these cards in Germany land on eBay at scalper prices.  

My 6800XT was also a reference card, water cooled with an EK block.
That reminds me that I need to sell my EK Quantum Vector block on eBay, as I don't need it anymore.


----------



## Butanding1987 (Jun 7, 2021)

turbogear said:


> On the other hand it is not easy to get a block for 6900XT Red Devil.
> @ratirt owns this card as far as I know.
> 
> Only Alphacool seems to offer one at the moment.
> ...


I was one of the early buyers of the Alphacool water block for the 6800 XT Nitro+. I also had to wait in line. Now, the blocks are available from different manufacturers. It's just a matter of time. But if you're impatient like me, you'd want to be one of the first to get one.


----------



## turbogear (Jun 7, 2021)

Butanding1987 said:


> I was one of the early buyers of the Alphacool water block for the 6800 XT Nitro+. I also had to wait in line. Now, the blocks are available from different manufacturers. It's just a matter of time. But if you're impatient like me, you'd want to be one of the first to get one.


I don't need one, but I feel pity for the guys who have been waiting almost 3 months for their pre-orders to be delivered. 






AMD - PowerColor Radeon RX 6900 XT Red Devil

"Hello everyone! Now that I've treated myself to an RX 6900 XT and regularly watch Igor's videos, I've also tried to 'thrash' my graphics card! As an absolute OC beginner, I cautiously tried to work my way forward the way I think I understood it from the videos..."

www.igorslab.de


----------



## Felix123BU (Jun 7, 2021)

After 10+ years on Steam I just got my first hardware survey; of course I participated 
Now Steam can no longer claim that nobody has an RX 6000 AMD GPU


----------



## turbogear (Jun 7, 2021)

Felix123BU said:


> After 10+ years on Steam I just got my first hardware survey; of course I participated
> Now Steam can no longer claim that nobody has an RX 6000 AMD GPU


Steam used to be my platform in the past, and I own more than 150 games there. 
In the last few years I have very rarely bought a game through them, as many of the games I played no longer launch on Steam.

For example, all the Battlefield games are on Origin, all the Far Cry games on Uplay, and some, like Borderlands 3, on Epic.
All the older Borderlands games I have on Steam. 

I should log in from time to time, just in case I get the HW survey and can maybe be the first 6900XT owner on Steam.


----------



## BarbaricSoul (Jun 7, 2021)

Has anyone else noticed Microcenter is starting to keep RX 6000 cards in stock? In-store purchase only, but look: https://www.microcenter.com/search/...by=match&N=4294966937+4294816285&myStore=true


----------



## Felix123BU (Jun 7, 2021)

BarbaricSoul said:


> Anyone else notice Microcenter is starting to keep RX 6000 cards in-stock? In store purchase only but look- https://www.microcenter.com/search/...by=match&N=4294966937+4294816285&myStore=true


That is interesting. Even if they are still double what they should be, it is a good sign that the market is slowly starting to recover. Hopefully they will gather dust on shelves at these prices and then start to be lowered incrementally.

5 in stock, what magic is that   

XFX RX 6800XT MERC 319 AMD Radeon 16GB GDDR6 3xDP: Amazon.de: Computer & Zubehör

I am sooo glad people are not buying them at these prices, it's daylight robbery


----------



## turbogear (Jun 8, 2021)

Felix123BU said:


> That is interesting. Even if they are still double what they should be, it is a good sign that the market is slowly starting to recover. Hopefully they will gather dust on shelves at these prices and then start to be lowered incrementally.
> 
> 5 in stock, what magic is that
> 
> ...


The seller is Amazon USA. 
Amazon has become a scalper itself. 

Asking the price of a 6900XTU Liquid Devil Ultimate for a 6800XT is just crazy. 

Now 4 are in stock, so it means somebody already bought one at this crazy price as well. 

Mindfactory has a 6800XT for 1399€.


https://www.mindfactory.de/product_info.php/16GB-ASRock-Radeon-RX-6800-XT-PHANTOM-GAMING-DDR6_1386619.html

There are multiple 6900XTs that are cheaper than this on Mindfactory.

https://www.mindfactory.de/Hardware/Grafikkarten+(VGA)/Radeon+RX+Serie/RX+6900+XT.html


----------



## Felix123BU (Jun 8, 2021)

turbogear said:


> The seller is Amazon USA.
> Amazon has become a scalper itself.
> 
> Asking the price of a 6900XTU Liquid Devil Ultimate for a 6800XT is just crazy.
> ...


It seems Mindfactory has a lot of the 6000 series in stock, and even Nvidia cards, 3060, 3060 Ti and 3080 Ti... of course at scalper prices  

It also seems that people have started catching on that the shortage is coming to an end and are not paying scalper prices 

Again, I hope they rot on shelves until the prices go back to MSRP 

A big FU to all scalpers


----------



## turbogear (Jun 8, 2021)

Felix123BU said:


> It seems Mindfactory has a lot of the 6000 series in stock, and even Nvidia cards, 3060, 3060 Ti and 3080 Ti... of course at scalper prices
> 
> It also seems that people started catching on that the shortage is coming to an end and are not paying scalper prices
> 
> ...


I have been keeping an eye on Mindfactory, as my 6900XTU also came from them.
So far they have had big stock drops during the week, which then sell out really quickly.
They are the only shop that has consistently had RDNA2 cards in stock since January, and if you compare their prices to other German shops, they are always the cheapest.

It seems they have special deals with many of the brands.
It actually looks like they have exclusive rights, like Epic with games , to get all the cards for themselves, as most other German shops have barely ever had any cards in stock.

It also seems they then set the base price for the German market, as they have most of the cards.


----------



## Felix123BU (Jun 8, 2021)

turbogear said:


> I have been keeping an eye on Mindfactory, as my 6900XTU also came from them.
> So far they have had big stock drops during the week, which then sell out really quickly.
> They are the only shop that has consistently had RDNA2 cards in stock since January, and if you compare their prices to other German shops, they are always the cheapest.
> 
> ...


Yeah, it seems they have a lot more GPUs than other shops, and are cheaper too.

One thing I still wonder about: the product page of the RX 6900 XT Red Devil says over 770 pieces sold. I would roughly estimate that the 6000 series has sold a few tens of thousands of cards worldwide from launch till today, so how come the Steam hardware survey shows 0 to this day???

Both Nvidia and AMD had record GPU sales in Q4 last year, and even knowing that Nvidia's market share is 75%, that would still be a lot of GPUs sold by AMD since the launch of Navi 2.

According to Steam, they have around 20 million concurrent users per day. So let's say the AMD 6000 series sold 30,000 GPUs (that's not much, I would say); that would be 0.15%. Assuming that 1 out of 10 users received the Steam hardware survey, and assuming they extrapolate that to the number of polled people, it is very, very unlikely that the survey did not find any percentage of 6000 series GPUs. WTF, this is odd!?!?


----------



## turbogear (Jun 8, 2021)

Felix123BU said:


> Yeah, it seems they have a lot more GPUs than other shops, and are cheaper too.
> 
> One thing I still wonder about: the product page of the RX 6900 XT Red Devil says over 770 pieces sold. I would roughly estimate that the 6000 series has sold a few tens of thousands of cards worldwide from launch till today, so how come the Steam hardware survey shows 0 to this day???
> 
> ...


Yes that is weird indeed.


----------



## ratirt (Jun 9, 2021)

The prices are still off the charts, but availability is better for the 6900XTs. There are several available for purchase. 
The lowest price I found is $2300 for the 6900XT Gaming Trio. I don't think the prices will go down this year. If ever?


----------



## nguyen (Jun 9, 2021)

Felix123BU said:


> According to Steam, they have around 20 million concurrent users per day. So let's say the AMD 6000 series sold 30,000 GPUs (that's not much, I would say); that would be 0.15%. Assuming that 1 out of 10 users received the Steam hardware survey, and assuming they extrapolate that to the number of polled people, it is very, very unlikely that the survey did not find any percentage of 6000 series GPUs. WTF, this is odd!?!?



20 million concurrent means 20 million people are logged into Steam at any given moment; it doesn't mean 20 million is the total number of Steam users (well, unless people play games 24/7).
Steam has around 120 million active accounts or more at the moment, so each GPU needs around 180K units sold worldwide to make it into the Steam survey (the lowest listed share is 0.15%). The 6700 XT might make it into the Steam survey in the next few months.
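That threshold can be sanity-checked with quick back-of-the-envelope math. A minimal sketch, assuming the thread's rough figures (120 million active accounts, a 0.15% smallest survey bucket) rather than any official Steam numbers:

```python
# Back-of-the-envelope Steam survey math, using the rough figures
# from this thread (not official Steam data).

active_steam_users = 120_000_000  # estimated active accounts
smallest_bucket = 0.0015          # 0.15%, smallest share the survey lists

# Units a card needs sold before it can show up in the survey at all
threshold_units = active_steam_users * smallest_bucket
print(round(threshold_units))  # 180000

# Using 20M (concurrent users) as the base, as in the earlier post,
# makes 30,000 cards look like exactly the 0.15% cutoff:
print(30_000 / 20_000_000)  # 0.0015
# Against the larger active-account base, it is far below the cutoff:
print(30_000 / active_steam_users)
```

This is why the earlier estimate seemed to predict the cards should already be visible: dividing by concurrent users instead of total active accounts overstates the share by a factor of six.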


----------



## Felix123BU (Jun 9, 2021)

nguyen said:


> 20 million concurrent means 20 million people are logged into Steam at any given moment; it doesn't mean 20 million is the total number of Steam users (well, unless people play games 24/7).
> Steam has around 120 million active accounts or more at the moment, so each GPU needs around 180K units sold worldwide to make it into the Steam survey (the lowest listed share is 0.15%). The 6700 XT might make it into the Steam survey in the next few months.


So according to you, Nvidia sold more than 180,000 3090s. I find that hard to believe  ... but then people paid 1800+ EUR for 6800 XTs, so everything is possible; this "shortage" made everyone act foolish 

That, or miners logging into Steam with their mining rigs


----------



## nguyen (Jun 9, 2021)

Felix123BU said:


> So according to you, Nvidia sold more than 180,000 3090s. I find that hard to believe  ... but then people paid 1800+ EUR for 6800 XTs, so everything is possible; this "shortage" made everyone act foolish
> 
> That, or miners logging into Steam with their mining rigs



Nvidia earned 2.76 billion USD in Q1 2021 selling GPUs alone. If their average selling price across all Ampere GPUs (3060, 3060 Ti, 3070, etc.) was 700 USD (which is already quite unlikely, because Nvidia sells chips to AIBs), that would mean 3.94 million chips were sold in 3 months, or about 1.3 million chips per month.
Yes, there is no doubt Nvidia has sold over 400K 3090s since September 2020.
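The unit estimate above falls straight out of the two assumed numbers. A quick sketch, where the $2.76B revenue and the $700 average selling price are the post's assumptions, not official figures:

```python
# Rough chips-sold estimate from quarterly GPU revenue, using the
# assumed figures from this thread (not official Nvidia data).

quarterly_gpu_revenue = 2.76e9  # USD, gaming GPU revenue in the quarter
avg_selling_price = 700         # USD per chip, assumed average

chips_per_quarter = quarterly_gpu_revenue / avg_selling_price
chips_per_month = chips_per_quarter / 3

print(round(chips_per_quarter / 1e6, 2))  # 3.94 (million chips per quarter)
print(round(chips_per_month / 1e6, 2))    # 1.31 (million chips per month)
```

Since a higher average selling price would only lower the unit count, the 3.94 million figure is a rough upper-bound-style guess under these assumptions, not a measured number.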


----------



## Felix123BU (Jun 9, 2021)

nguyen said:


> Nvidia earned 2.76 billion USD in Q1 2021 selling GPUs alone. If their average selling price across all Ampere GPUs (3060, 3060 Ti, 3070, etc.) was 700 USD (which is already quite unlikely, because Nvidia sells chips to AIBs), that would mean 3.94 million chips were sold in 3 months, or about 1.3 million chips per month.
> Yes, there is no doubt Nvidia has sold over 400K 3090s since September 2020.


No doubt, none, except that unless you are the Leather Jacket Man, you cannot state those as absolute facts


----------



## xenosys (Jun 11, 2021)

A little advice from someone who's a bit more experienced than I am with this would be appreciated...

I'm trying to disassemble my EK Quantum Vector 6800/6900 water block to reapply thermal paste to the GPU die, but it appears one of the mounting screws has been stripped, so I'm unable to grip the screw with a screwdriver to pull it out.  I've tried the elastic-band trick to increase friction between the two, but it's not working.  Does anyone have advice on how to remove it without exerting a dangerous amount of pressure on the PCB?

Thanks!


----------



## BarbaricSoul (Jun 11, 2021)

Very small easyout bit?


----------



## xenosys (Jun 11, 2021)

BarbaricSoul said:


> Very small easyout bit?



Good idea.  Are there any you would recommend?  It's for an M2.5 screw, and most bits I've seen go from 3mm up.


----------



## BarbaricSoul (Jun 11, 2021)

I've never had to use an easy-out on a screw that small, so I have no experience doing it myself. I don't even know for sure if you can get one small enough. 

Honestly, unless it's a matter of the card overheating, I wouldn't mess with it. GPUs are too damn expensive and hard to get to be doing stuff like disassembly without a "have to" reason. When I sold my RTX 2070, it had a failed fan. Because it was a Founders Edition, and from what I read about how complicated the heatsink is to remove and reinstall, I knocked $100 off the price and sold it as-is. I even had the replacement fan. Good thing, too; it turned out I had the wrong size thermal pads.


----------



## Felix123BU (Jun 11, 2021)

xenosys said:


> Little advice from someone who's a little more experienced than I am with regards to this would be appreciated ...
> 
> I'm trying to disassemble my EK Quantum Vector 6800/6900 water block to reapply some thermal paste to the GPU die, but it would appear one of the mounting screws has been stripped, so I'm unable to grip the screw with a screwdriver to pull it out.  I've tried the elastic band trick to try and increase friction between the two, but it's not working.  Does anyone have any advice on how to remove it without exerting dangerous amount of pressure on the PCB?
> 
> Thanks!


A pair of fine pliers (or any pliers that can grip the screw) can do the trick if there is enough of the head left to grip, but it depends a lot on how the screw head is shaped. It's a rather delicate operation, since you need to grip it just right, but it can be done if really needed.


----------



## turbogear (Jun 11, 2021)

xenosys said:


> Good idea.  Is there any you would recommend?  It's for a M2.5 screw, and most bits I've seen go from 3mm up.


A damaged-screw extractor can help here.
I used one myself to extract a damaged screw on my bike's hydraulic brakes.
It seems some are also available in small screw sizes.
Here is an example from Amazon:





Dalebouti Damaged Screw Extractor Set 22 Pcs Stripped Screw Extractor Kit All-Purpose Screwdriver Bits for Broken Bolt Stripped Screw Extractors - Amazon.com

www.amazon.com




I did it manually and did not use a drill like they show in the Amazon example. 
I did not want to damage my expensive Magura hydraulic brakes.


----------



## xenosys (Jun 11, 2021)

Nice guys, thanks for your advice, I'll give it a crack next week when the things I've ordered arrive.

Turbo, would that set include drill bits small enough for those screws?  EDIT... nvm, they go down to 1.5mm, so they should be fine. Thanks.


----------



## Godhand007 (Jun 12, 2021)

xenosys said:


> Nice guys, thanks for your advice, I'll give it a crack next week when the things I've ordered arrive.
> 
> Turbo, would that set include drill bits small enough for those screws?  EDIT ... nvm, those go down to 1.5mm so they should be fine,  thanks.


Can you share a pic of the actual screw? I want to see the distorted shape of the screw you are having an issue with.


----------



## jesdals (Jun 19, 2021)

Decided to join the Thermal Pad on reference backplate club - will do some back-to-back testing to see if there is any advantage




Using Thermal Grizzly Minus Pad 8 - 3mm thick on the memory areas and 1.5mm on the rest




Mem temp in Destiny 2 - ambient about 27°C today
The "before" test below was at an ambient of 21°C




About an 8°C difference in Destiny 2


----------



## mama (Jun 19, 2021)

jesdals said:


> Decided to join the Thermal Pad on reference backplate club - will do some identical testing to see if there is any advantage
> View attachment 204548
> Using Thermal Grizzly Minus Pad 8 - 3mm thick on memory areas and 1,5mm on rest
> 
> ...


Better.  Try undervolting for lower temps again.


----------



## Butanding1987 (Jun 20, 2021)

jesdals said:


> Decided to join the Thermal Pad on reference backplate club - will do some identical testing to see if there is any advantage
> 
> Using Thermal Grizzly Minus Pad 8 - 3mm thick on memory areas and 1,5mm on rest
> 
> ...


So you plastered the card with pads and replaced the pad on the GPU with paste?


----------



## GamerGuy (Jun 20, 2021)

I'd like to do what you guys have done, but in my neck of the woods, a successful RMA (should I need one) requires that the stickers applied to my card by the local distributors be untouched. If there are signs that these stickers have been tampered with, or removed, the warranty is immediately void. So I guess I'll have to content myself with enjoying such nice mods vicariously...


----------



## jesdals (Jun 20, 2021)

Butanding1987 said:


> So you plastered the card with pads and replaced the pad on the GPU with paste?


Only the backplate, so no major operation only the backplate screws


----------



## turbogear (Jun 20, 2021)

GamerGuy said:


> I'd like to do what you guys have done, but in my neck of the woods, successful RMA (should I need one) requires that the stickers applied on my card by the local distributers be untouched. IF there is or are signs that these stickers have been tempered with, or removed, the warranty is immediately void. So, I guess I'll have to contend myself with such nice mods vicariously...


You can buy warranty stickers on AliExpress.  
I bought some myself. 
They have them for almost every brand. 
Somebody here on the forum gave me this hint.



jesdals said:


> Only the backplate, so no major operation only the backplate screws


Yes, putting paste on the reference cooler may not work anyway, because the distance between the cooler and the GPU die may not be right: the cooler is designed for a relatively thick thermal pad, so with paste the screws might need to be tightened really hard to get contact with the heatsink.


----------



## turbogear (Jun 22, 2021)

AMD has launched the driver with FSR (FidelityFX Super Resolution) support:


			https://www.amd.com/en/support/kb/release-notes/rn-rad-win-21-6-1
		


People who have Godfall may be able to test it, but maybe the game needs an update to support this?

Some info for those who haven't heard what FSR is: 


			https://www.amd.com/en/technologies/radeon-software-fidelityfx-super-resolution?utm_campaign=fsr&utm_medium=redirect&utm_source=301


----------



## Space Lynx (Jun 22, 2021)

turbogear said:


> AMD has launched the driver with FSR (FidelityFX Super Resolution) support:
> 
> 
> https://www.amd.com/en/support/kb/release-notes/rn-rad-win-21-6-1
> ...




Here's hoping we get flooded with FSR reviews in the next 24-48 hours. It says "select games", plural, so there is indeed more than one supported with this driver release.


----------



## turbogear (Jun 22, 2021)

lynx29 said:


> here is to hoping we get flooded with FSR reviews in next 24-48 hours, it says "select games" plural, so there is indeed more than one supported with this driver release.


Let's hope @W1zzard will do some review for us.


----------



## ratirt (Jun 22, 2021)

I'm looking forward to the reviews. I'm sure it won't be breathtaking, but I think it's good to have an option like FSR. 
I saw a list of 12 (if I'm correct) at the start, but maybe it was 10?


----------



## Space Lynx (Jun 22, 2021)

ratirt said:


> I'm looking forward for the reviews. I'm sure it won't be breathtaking but I think it's good to have an option like FSR.
> I seen a list of 12 (if I'm correct) at start but maybe it was 10?



I'm just hoping it's better than simply lowering the resolution... that's the main thing. I'm not expecting DLSS 2.0 quality, but as long as it beats a side-by-side comparison with just lowering the res, I'd be happy.


----------



## turbogear (Jun 22, 2021)

ratirt said:


> I'm looking forward for the reviews. I'm sure it won't be breathtaking but I think it's good to have an option like FSR.
> I seen a list of 12 (if I'm correct) at start but maybe it was 10?


According to this article on TPU, 7 games will be supported at launch and 12 more will be added in the near future.









						AMD FSR Supporting 7 Games at Launch, 12 More Games to be Added in the Near Future
					

AMD's DLSS competitor FidelityFX Super Resolution (FSR) is going to be launched in a mere five days, on June 22nd. When AMD announced the technology last month, they used Godfall as a showcase for the improved performance characteristics of the technology, which should aid (particularly) in...




				




I have to say, of these 7 I only know 2.   
I don't even own either of those 2.
@ratirt, you have Godfall. Maybe I will also pick it up if you think it is a fun game.


----------



## GamerGuy (Jun 22, 2021)

It starts off with 7 supported games, including Godfall, about 12 incoming games (including FC6 and RE Village) with FSR support, and a whole slew of game devs pledging to support the feature. I'll probably get Godfall just so I can test it on both my Vega 64 and RX 6900 XT rigs.....and see if it runs on the GTX 1080.


----------



## ratirt (Jun 22, 2021)

GamerGuy said:


> It starts off with 7 games with support, including Godfall, and about 12 incoming games (including FC6, RE Village) with FSR support, and a whole slew of game devs that would support the feature. I'll probably get Godfall just so I can test it on both my Vega64 and RX 6900 XT rigs.....and see if it runs on the GTX 1080.


I got a 6900XT and a 5600XT, so it would be great to get it tested. Godfall is a cool game and I've been sinking plenty of my lifetime into it for quite a while, you know. My guess is that the implementation in Godfall will be pretty good. I'm really curious; I can already play the game without FSR or any other FPS boost with no significant FPS drops on my 6900XT, but testing this FSR feature with my 5600XT is going to be a great opportunity.


----------



## turbogear (Jun 22, 2021)

GamerGuy said:


> It starts off with 7 games with support, including Godfall, and about 12 incoming games (including FC6, RE Village) with FSR support, and a whole slew of game devs that would support the feature. I'll probably get Godfall just so I can test it on both my Vega64 and RX 6900 XT rigs.....and see if it runs on the GTX 1080.


I am a big fan of the Far Cry games, and FC6 will support it. 
Pity that the game has been delayed until autumn.
I will get it for free  as a promotion with the 5800X processor that I bought in December last year.



ratirt said:


> I got a 6900XT and a 5600XT so it would be great to get it tested. God Fall is a cool game and I've been wasting my lifetime of this one for quite a while you know. I assume only (or it is my guess) that the implementation in the God Fall will be pretty good. I'm really curious although I can play that game without FSR or any other FPS boost with no significant FPS drops on my 6900XT but to test this FSR feature with my 5600XT is going to be a great opportunity.


Yes, I also get high FPS in most of the games I play on my 2K 165Hz monitor with my 6900XTU OCed to 2750MHz@1175mV.

It would be really nice if they added it to Cyberpunk too. That is the game that drops significantly with raytracing enabled. Without raytracing, the FPS hovers in the 90-110 range.


----------



## ratirt (Jun 22, 2021)

turbogear said:


> I am big fan of Far Cry games. FC6 will support it.
> Pity that this game has been delayed until Autumn.
> I will get it for free  as promotion with 5800X processor that I bought in December last year.
> 
> ...


Actually, I play at 4K 144Hz and the 6900XT does the trick  
I know it will be implemented in CP2077; you already have FidelityFX there, and I read that news about CP2077 somewhere. A lot of developers are interested in implementing FSR. People can say whatever they want and praise Nvidia, but the fact of the matter is that FSR will be a game changer if it turns out to work on all hardware regardless of vendor. Improvements to the implementations will come as well, and I'm pretty sure that will happen faster than with DLSS 1.0 due to the huge number of interested devs - because FSR is open to everyone and it also runs on APUs, which is the kicker here.


----------



## puma99dk| (Jun 22, 2021)

For now I've got 2. This wasn't the plan, so it kinda sucks, because I could use the money more than 2 GPUs.


----------



## ratirt (Jun 22, 2021)

turbogear said:


> @ratirt you have GodFall. Maybe I will also pick it if you think it is fun game.


I really like this game so damn much, and it looks outstanding. I really wanna see FSR on this one just to see how good it is. As for the game itself, it is really entertaining. 
I sometimes just launch a stage, run around killing mobs and practice my fighting techniques


----------



## turbogear (Jun 22, 2021)

puma99dk| said:


> For now I got 2, this wasn't the plan so it kinda sucks because I could more use the money then 2 GPU's


If you don't need two GPUs, you can sell one on eBay.  
At the moment, I think the market is good for used GPUs.

I sold my Radeon VII in December to get the 6800XT.
I was blown away that people raised the price from a starting point of 350€ to 700€ in the last few minutes before bidding ended. 
The card was 1.5 years old; I originally paid 800€ for it and got 700€ back.


----------



## puma99dk| (Jun 22, 2021)

turbogear said:


> If you don't need two GPUs, you can sell one on eBay.
> At the moment, I think the market is good for used GPUs.
> 
> I sold my Radeon VII in December to get 6800XT.
> ...



I don't sell on eBay, because if I ship outside my country, shipping is like 100USD if not more, and I don't think anyone wants to pay that.

But yeah, it's crazy times right now.


----------



## turbogear (Jun 22, 2021)

puma99dk| said:


> I don't sell on ebay because if I go outside my country shipping is like 100USD if not more and no one want to pay that I think.
> 
> But yeah it's crazy times right now.


I've sold a lot of old computer stuff on eBay in the past, but I usually specify that I'm shipping and selling the item only within Germany.


----------



## puma99dk| (Jun 22, 2021)

turbogear said:


> I sold in past many old computer stuff on eBay but I specify usually that I am shipping and selling the item only in Germany.



Yeah, eBay ain't as great on this side of the German border; we order from ebay.de because the prices are better


----------



## turbogear (Jun 22, 2021)

puma99dk| said:


> Yeah, ebay ain't as great on this side of the Germany boarder, we order from ebay.de because prices are better


Yes, eBay works quite well over here to get rid of old computer stuff and at the same time earn some money to invest in new hardware.  
Used GPUs currently have a nice second-hand market, so you will be able to sell your old one.

I am going to sell a GTX 1070Ti now.
I got my son a 6800XT for a good price and will sell his 1070Ti.
I was looking, and used 1070Tis go for 350€ on eBay. 
I bought it new 3 years ago for 400€.  Crazy how prices have gone up. 
People are bidding quite high just to get a GPU.


----------



## Felix123BU (Jun 22, 2021)

Having looked at all the FSR reviews I could find, I am relatively impressed. I don't have any of the launch games, so I can't form my own first-hand opinion, but from what I have seen, it's rather impressive. Not bad AMD, not bad. Now stop trying to be Nvidia with prices and things might look up for the GPU division


----------



## GamerGuy (Jun 22, 2021)

I just finished downloading Godfall and have installed Adrenalin 21.6.1, but it's kinda late here (about 2am), so I'm gonna hit the sack and try out Godfall later. I'll also install the game on my other two rigs and see how they do as well.


----------



## Felix123BU (Jun 22, 2021)

GamerGuy said:


> I just completed downloading Godfall, and have just installed Adrenalin 21.6.1, but it's kinda late here (about 2am), so I'm gonna hit the sack and try out Godfall later. Will also install the game in my other two rigs and see how they do as well.


A couple of reviewers said that FSR does not need a specific driver; it should work with any driver since it's implemented in the game engine. So I'm curious: what does the new driver do for FSR?


----------



## turbogear (Jun 22, 2021)

I ran Time Spy with the latest 21.6.1 driver on my 6900XTU Liquid Devil Ultimate. 
Here is the result. 
This is 10th place when the 6900XT cards are sorted by graphics score on the 3DMark website. 






The setting is 2750MHz@1175mV; see the screenshot at the end of the post for details.
It is the same setting I have been using for a long time, and it is really stable in every game.



One thing I tried today for the first time was OCing the VRAM as well.
It does not seem that the GPU likes a memory OC when the core is OCed that high. 
If I OC the memory (Fast Timing) by even 50MHz with the GPU at 2750MHz, Time Spy does not complete a full run; it stops near the end of the GT2 test. 

If I set the GPU clock to default and then auto-OC only the VRAM, I can go to 2180MHz, but with the core OCed at 2750MHz, even 2050MHz on the VRAM did not work. 
Seems like my card does not like a memory OC on top of a core OC, but I am still very satisfied with the performance. 




These are the settings the benchmark was run with.


MPT settings:


----------



## Felix123BU (Jun 23, 2021)

turbogear said:


> I ran Time Spy with the latest 21.6.1 driver with my 6900XTU Liquid Devil Ultimate.
> Here is the result.
> This is place 10 sort towards the graphics scores of 6900XT cards on 3Dmark website.
> 
> ...


I also seem to get a nice overall performance boost with driver 21.6.1. I tested CP2077, +7FPS (+9%), and Shadow of the Tomb Raider, +6FPS (+6%); in SOTTR that makes a 17% improvement for me from the launch driver to today's driver. Very, very nice


----------



## turbogear (Jun 23, 2021)

Felix123BU said:


> I also seem to get a nice performance boost overall with driver 21.6.1, tested CP2077, +7FPS (+9%) and Shadow of the Tomb Raider, +6 FPS (+6%), in SOTR there is a 17% improvement from the first launch driver to todays driver for me , very very nice


I also need to run the AC Valhalla and Borderlands 3 benchmarks to see the increase compared to my old results.  

That is what we call the AMD fine wine approach. 

It would be great if one of the reviewers updated their reviews to show how RDNA2 and the RTX series have gained performance relative to each other since launch through driver improvements.


----------



## Godhand007 (Jun 23, 2021)

Felix123BU said:


> Having looked at all the reviews on FSR I could find, I am relatively impressed. Have no game out of those present at launch, so cant have a subjective own eyes opinion, but from what I have seen, its rather impressive. Not bad Amd, not bad, now stop trying to be Nvidia with prices and things might look up for the GPU division


You can try RiftBreaker Demo for free.


----------



## turbogear (Jun 23, 2021)

Felix123BU said:


> Having looked at all the reviews on FSR I could find, I am relatively impressed. Have no game out of those present at launch, so cant have a subjective own eyes opinion, but from what I have seen, its rather impressive. Not bad Amd, not bad, now stop trying to be Nvidia with prices and things might look up for the GPU division


To be honest, I am also positively surprised after reading W1zzard's review.  
I was not expecting this, especially after the rumors that had been spreading for weeks claiming this would be a total disaster for AMD. 

I am thinking of buying Godfall to test it.
Seems like a fun game. 

The funny part is that even older generations of Nvidia GPUs are supported, so they also get this free FPS boost thanks to AMD. 
It would be nice if Nvidia returned the favor and opened up something like DLSS so that we AMD GPU owners could benefit, but maybe I am just dreaming.  



Godhand007 said:


> You can try RiftBreaker Demo for free.


Thanks for the hint.


----------



## Felix123BU (Jun 23, 2021)

Godhand007 said:


> You can try RiftBreaker Demo for free.


Doing that. I am really curious to see FSR with my own eyes; a YouTube video is compressed by itself and does not give a clear picture. Let's see  



turbogear said:


> To be honest I am also positively surprised after reading the review from W1zzard.
> I was not expecting this, especially after the rumors that were spreading since many weeks claiming this will be a total disaster for AMD.
> 
> I am thinking of buying GodFall and test it.
> ...


Nvidia releasing DLSS for the masses? Nope, for 3 reasons. First, if DLSS worked on normal GPUs, that would be an admission of lying about Turing and Ampere needing specialized cores for DLSS; they cannot do that. Second, they would lose their incentive for pushing tensor cores, and thus new series of GPUs. And third, they would lose marketing power - no more "only our new shit is the best because it can run DLSS in hardware" 

I really don't care about DLSS; I care more about an open industry standard that can help the whole industry move forward and bring improvements across the board. And for that matter, I don't really care about FSR that much either. I am just curious to see how it works; I prefer native resolution over any fancy upscaling gimmick all day long


----------



## turbogear (Jun 23, 2021)

Felix123BU said:


> Doing that, I am really curios to see FSR with my own eyes, YouTube video is by itself compressed and does not give a clear picture. Lets see
> 
> 
> Nvidia releasing DLSS for the masses? Nope, for 3 reasons, first being that if DLSS works on normal GPU's, that would be an admission of lying about Turing and Ampere having specialized cores for DLSS, they can not do that , and second would be that they have no more incentive for pushing tensor cores, so new series of Gpu's , and third, they would loose marketing power, no more "only our new shit is the best that can run DLSS on hardware"
> ...


I agree.
I also prefer native resolution, as long as my GPU does not run into trouble, for example at 4K with raytracing enabled.  

With my current 2K monitor, I am actually more worried about going above the 165Hz FreeSync range in some games with the heavily OCed 6900XTU.


----------



## Felix123BU (Jun 23, 2021)

Godhand007 said:


> You can try RiftBreaker Demo for free.


Ok, so I tried the RiftBreaker demo, and I am rather positively impressed. At 3440x1440, native vs FSR Ultra Quality, the fine details are almost all there; the difference I can perceive is in the tone of the image. The FSR-ed image is a very tiny bit softer overall, but you sort of have to look closely to see the difference. I also perceive a static image as a bit better than an image in motion in this game. Performance mode is crap, very smeary, but double the FPS  

This game does not seem too sharp even at native; I would be even more curious about FSR in a game with super-high-res textures and lots of very intricate details. For RiftBreaker, if you need the extra FPS, FSR is perfect, at least on Ultra Quality and to a lesser degree on Quality.
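For anyone wondering what these modes actually do under the hood: FSR 1.0 is a spatial upscaler, so each mode just renders internally at a lower resolution and then upscales and sharpens to the target. Going by the per-axis scale factors AMD published (1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance), here is a quick sketch to work out the internal render resolution - note the exact in-game numbers may round slightly differently:

```python
# Sketch: internal render resolution for each FSR 1.0 quality mode,
# using the per-axis scale factors AMD published.
SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w, target_h, mode):
    """Return the (width, height) the game renders at before FSR upscales."""
    s = SCALE[mode]
    return round(target_w / s), round(target_h / s)

for mode in SCALE:
    print(mode, render_resolution(3440, 1440, mode))
```

So at my 3440x1440, Ultra Quality renders at roughly 2646x1108 while Performance drops all the way to 1720x720, which is why Performance mode looks so smeary.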


After seeing FSR in RiftBreaker, I can say only one thing: Cyberpunk needs this tech very badly on the AMD 6000 GPUs. I would have zero issues turning on FSR Ultra Quality or Quality with RT in Cyberpunk 

And btw, RiftBreaker seems like a nice game; I might give it a shot when I have some free time  



turbogear said:


> I agree.
> I will also prefer native resolution as long as my GPU does not run into trouble for example on 4k with Raytracing enabled.
> 
> With my current 2k monitor I am actually worried from going above the 165Hz FreeSync range in some games with heavily Oced 6900XTU.


Yeah, my biggest worry since I got the 6800XT is also going above my 100Hz FreeSync range at 3440x1440 - this is the meaning of being spoiled


----------



## Guwapo77 (Jun 23, 2021)

I'm new to this place and I just saw this post, so I know I'm late as all hell.  Here it is anyways...


----------



## Felix123BU (Jun 23, 2021)

Guwapo77 said:


> I'm new to this place and I just saw this post, so I know I'm late as all hell.  Here it is anyways...
> 
> View attachment 205100View attachment 205101View attachment 205102View attachment 205104


Welcome, the red theme looks really nice


----------



## turbogear (Jun 23, 2021)

Guwapo77 said:


> I'm new to this place and I just saw this post, so I know I'm late as all hell.  Here it is anyways...
> 
> View attachment 205100View attachment 205101View attachment 205102View attachment 205104


Welcome to the club. 
I have to say your setup looks damn good.


----------



## Guwapo77 (Jun 23, 2021)

Felix123BU said:


> Welcome, the red theme looks really nice


Thank you, I greatly appreciate it.



turbogear said:


> Welcome to the club.
> I have to say your setup looks damn good.


Thank you very much!  /salute


----------



## turbogear (Jun 23, 2021)

Felix123BU said:


> After seeing FSR in Riftbreaker, I can say only one thing, Cyberpunk needs this tech so very bad on AMD 6000 gpu's, I would have 0 issues turning on FSR Ultra or Quality with RT in Cyberpunk


I have to agree; as I also wrote in the FSR review thread, it would be great if CD Projekt Red supported FSR.
It would be especially useful with raytracing enabled. 



Guwapo77 said:


> I'm new to this place and I just saw this post, so I know I'm late as all hell.  Here it is anyways...
> 
> View attachment 205100



Interesting that you had a Radeon R9 Fury.
I owned this card before moving on over the years: Vega 64 --> Radeon VII --> XFX reference 6800XT --> PowerColor 6900XTU Liquid Devil Ultimate.  

Do you own the Red Devil 6800XT or 6900XT?
They both look the same.


----------



## jesdals (Jun 23, 2021)

Guwapo77 said:


> I'm new to this place and I just saw this post, so I know I'm late as all hell.  Here it is anyways...


Love your Eyefinity setup - congrats

Please fill out your system specs


----------



## Guwapo77 (Jun 23, 2021)

turbogear said:


> I have to agree as I also wrote in FSR review forum, it would be great if CD Project Red will support FSR.
> It would be useful with Raytracing enabled.
> 
> 
> ...


I had two Fury X's in CrossFire (yes, I know, I'm laughing on the inside as well).  To answer your question, it's the 6900XT.  Next generation, I will hold out for the top-of-the-line AIO Super Special Turbo Elite Edition too!  I normally do, but this shortage had me freaking out.  Once I saw this card for $1400, I jumped on it before prices went astronomical.



jesdals said:


> Love your Eyefinity setup - congrats
> 
> Please fill out your system specs


Thanks bro.  I thought my little CPU-Z link would do the trick.  LoL  Ok I'll do the non-lazy version later.


----------



## turbogear (Jun 23, 2021)

Guwapo77 said:


> I had two Fury X's in CrossFire (Yes I know, I'm laughing on the inside as well).  To answer your question, its the 6900XT.  Next generation, I will hold out for the top of the line AIO Super Special Turbo Elite Edition too!  I do normally, but this shortage had me freaking out.  Once I saw this card for $1400, I jumped on it before prices hit astronomical prices.
> 
> 
> Thanks bro.  I thought my little CPU-Z link would do the trick.  LoL  Ok I'll do the non-lazy version later.


$1400 is a great price for the 6900XT Red Devil.  
Currently they cost about 300€ more than that here in Germany.

The Fury X was nice at the time. I sold it for a good price when Vega 64 came out.

I was actually satisfied with my 6800XT, which I had water-cooled; it was an OC monster running at 2750MHz@1020mV, but unfortunately it got damaged by an ESD issue. I got my money back for it and then went for the 6900XTU.


----------



## Felix123BU (Jun 23, 2021)

I forgot to add: FSR is sooo much better than the FidelityFX resolution modifier or the Radeon Boost thing. Congrats AMD


----------



## Guwapo77 (Jun 23, 2021)

turbogear said:


> 1400$ is a great price for 6900XT Red Devil.
> Currently they cost about 300 more than that here in Germany.
> 
> Fury X was nice at that time. I sold it for good price when Vega 64 came.
> ...


I'm here in Germany as well; however, I got it back in February from Computer Universe before everything skyrocketed...  But I don't pay VAT either.


----------



## jesdals (Jun 23, 2021)

Guwapo77 said:


> Thanks bro.  I thought my little CPU-Z link would do the trick.  LoL  Ok I'll do the non-lazy version later.


Nice setup - you should see this thread to https://www.techpowerup.com/forums/threads/aorus-x570-master.257392/


----------



## Guwapo77 (Jun 23, 2021)

jesdals said:


> Nice setup - you should see this thread to https://www.techpowerup.com/forums/threads/aorus-x570-master.257392/


I see, thanks for that page.  I sure as hell could have used this before F33!


----------



## turbogear (Jun 24, 2021)

Felix123BU said:


> Ok, so tried the RiftBreaker Demo, I am rather positively impressed, at 3440x1440 native vs FSR Ultra, the fine details are almost all there, the difference I can perceive is the tone of the image, the FSR-ed image is a very tiny bit softer overall, but you sort of have to look closely to see the difference. I also perceive the static image as a bit better than an image in motion in this game. Performance mode is crap, very smeary, but double the FPS


I tried the demo also. 
Yes, FSR Ultra Quality shows no image quality loss to me either.
I have to say, though, the game is no challenge for my setup with a 2K 165Hz monitor and the heavily OCed 6900XT. 
It runs at 165FPS almost all the time  with Vsync enabled, even without FSR.


----------



## Butanding1987 (Jun 27, 2021)

I don't know why I'm running Time Spy Extreme. Should have fiddled with Fire Strike instead. LOL.
I think I can still push it further. It's 36 degrees Celsius now in Manila; I think I should haul the PC to the bedroom where there's AC.


----------



## Garlic (Jul 4, 2021)

Is it still not possible to increase the GPU voltage via SPPT?


----------



## kiddagoat (Jul 4, 2021)

@turbogear How are you liking your 6900XTU?  If you don't mind my asking, what was the MSRP for the card in your country?  Were you able to get it for around MSRP?  I am considering getting a Liquid Devil because my local Microcenter has a few of them in stock.  I am still debating between the 6800XT, 6900XT, and the 6900XTU.  I am hoping I will not need to upgrade my GPU again for a good while after this.   

I was reading that the Liquid Devil 6900XTU had an MSRP of 2500 USD.


----------



## Kabouter Plop (Jul 4, 2021)

Is something wrong with my GPU? The memory clock sometimes even peaks at 14GHz - the AMD performance tab even reports this. It peaks at those clocks for 1 second, then drops back down


----------



## Ja.KooLit (Jul 4, 2021)

Kabouter Plop said:


> View attachment 206618
> Is something wrong with my gpu, memory clocks sometimes even peek to max 14 ghz even reports this inside amd performance tab it peeks for 1 second to those clocks then drops back down


Could it be a HWiNFO bug? Are you using the beta one?


----------



## Kabouter Plop (Jul 4, 2021)

It's showing those peaks in the AMD performance tab as well


----------



## turbogear (Jul 4, 2021)

kiddagoat said:


> @turbogear How are you liking your 6900XTU?  If you don't mind my asking, what was the MSRP in your country for the card?  Were you able to get it for around MSRP?  I am considering getting a Liquid Devil because my local Microcenter has a few of them in stock.  I am still debating between the 6800XT, 6900XT, or the 6900XTU.     I am hoping that I will not need to upgrade my GPU again for a good while after this.
> 
> I was reading that the Liquid Devil 6900XTU was MSRP for 2500 USD.


Well, I don't know the MSRP for the card, but it cost me 2199€ back in April.
Now they are selling for even more, about 2500€.


----------



## Space Lynx (Jul 4, 2021)

Kabouter Plop said:


> View attachment 206618
> Is something wrong with my gpu, memory clocks sometimes even peek to max 14 ghz even reports this inside amd performance tab it peeks for 1 second to those clocks then drops back down



plot twist: you accidentally went into a time portal and arrived back in our present time with a hyper next-gen graphics card. 5GHz on air is normal in that timeline.  well done! time to buy a 360Hz monitor!


----------



## Kabouter Plop (Jul 4, 2021)

lynx29 said:


> plot twist, you accidently went into a time portal and arrived back to our present time with a hyper next gen graphics card, 5ghz on air is the normal from that timeline.  well done! time to buy a 360hz monitor!


You're not being funny. I wanna know if this is normal behaviour or if my 6900 XT is overclocking itself


----------



## Space Lynx (Jul 4, 2021)

Kabouter Plop said:


> Your not being funny, i wanna know if this is normal behaviour or if my 6900 XT is overclocking it self



lol, it's clearly a bug, no need to worry about it. If it was actually hitting 5GHz it would have fried or shut down your PC, so yeah, it's just a bug.  Why not use GPU-Z to monitor it instead of HWiNFO, then compare?


----------



## Kabouter Plop (Jul 4, 2021)

It's reporting those spikes in Radeon's own performance metrics as well. Heck, I just checked GPU-Z and it reports the same spikes there too


----------



## Space Lynx (Jul 4, 2021)

Kabouter Plop said:


> Its reporting those spikes in radeon own performance metrics as well, heck i just checked gpu-z and it reports same spikes in there as well



Very odd indeed! Hmm.  Maybe do a clean factory-reset install of the latest driver, or if you're already on the latest, maybe roll back to the previous WHQL driver and re-test?


----------



## GamerGuy (Jul 5, 2021)

Felix123BU said:


> I forgot to add, FSR is sooo much better than the Fidelity FX resolution modifier or the Radeon Boost thing, congrats AMD


DF was so negative that some even speculated DF is in Nvidia's pocket, given its main focus on image quality. I think all of us knew that a software-based resolution upscaler would not match the RTX cards' hardware + software DLSS implementation. DF failed to highlight that FSR can be used on older cards from both AMD and Nvidia, especially the still very capable GTX 1000 and GTX 900 series (which can leverage FSR), and just focused on image quality (I understand that, but highlighting all the negatives without talking about the one great positive is a fail to me).

I had fun with Godfall at 4K with FSR set to the 'Quality' preset and the ingame settings at Epic; the game ran between 40fps and 60fps and was just fine. This was with my Leadtek GTX 1080 Hurricane OC! Nvidia has totally forgotten about the Pascal and 2nd-gen Maxwell cards, not even bothering to help uplift their performance, focusing all their efforts on their secksay RTX line. BTW, strangely enough, I tried Godfall with FSR on my Vega 64 and it CTD'ed - what the heck happened?!

Edit - Didn't bother with FSR on my main rig, as it can do Godfall at the Epic preset + RT and still get ~100fps at 3840x1080. I'm thinking of getting RE Village - anyone tried it with their RX 6000 cards yet?


----------



## turbogear (Jul 5, 2021)

Kabouter Plop said:


> It's showing those peaks in the AMD performance tab as well


That is really strange behavior. I am not sure the card could hit 5GHz at all. 
Are your memory settings at default?
Maybe try cleanly removing the drivers with DDU and installing them again.


----------



## GamerGuy (Jul 5, 2021)

Kabouter Plop said:


> It's showing those peaks in the AMD performance tab as well


I just saw this. I'd noticed it on my card as well, using GPU-Z sensors with max readings; I'd put it down to GPU-Z reading it wrong. I didn't know HWiNFO was showing a similar max reading.


----------



## ratirt (Jul 5, 2021)

That is weird. My 6900 XT Red Devil shows correct values in HWiNFO and the AMD driver. Did you try reinstalling the driver and HWiNFO?


----------



## turbogear (Jul 5, 2021)

GamerGuy said:


> I just saw this. I'd noticed it on my card as well, using GPU-Z sensors with max readings; I'd put it down to GPU-Z reading it wrong. I didn't know HWiNFO was showing a similar max reading.


That is interesting.
I am quite sure I did not see it on the 6800XT, as I was OCing the VRAM there and monitoring it closely.

I need to check if I can also see something like this on my 6900XTU Liquid Devil Ultimate.
I have not monitored VRAM closely lately on the 6900XTU as I am not OCing the memory.
As a matter of fact, my 6900XTU does not OC on VRAM at all, not even 50MHz. 

Maybe I need to do a Buildzoid-style hardware mod to allow the memory to OC. 
On second thought, no no.... I am not brave enough to mod a super expensive card.


----------



## Felix123BU (Jul 5, 2021)

The stratospheric memory GHz is most likely due to some bug in the driver (21.6.1 for me, non-WHQL version). I saw this yesterday evening in HWiNFO64, 5-point-something GHz on memory, lol-ed at it, then proceeded to DDU and installed the newest driver; I have not seen these weird readings again, 2.1GHz max as set in Wattman. 

Would be fun though if AMD put in some 5GHz memory chips and locked them to 2GHz just to mess with us


----------



## Kabouter Plop (Jul 5, 2021)

I did DDU and it had no impact; it still reports it wrong. I'm glad I am not alone.
It only spikes when the clocks ramp up from idle: it goes from the idle clock to the maximum memory clock, but then spikes above 2GHz, even towards 12GHz or 13GHz
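For what it's worth, one way to separate a real sustained clock from a one-sample logging glitch is to look for values that hold across consecutive samples. A rough sketch of that idea (my own illustration, not anything GPU-Z or HWiNFO actually does; the sample values and window size are made up):

```python
# Hypothetical helper: given memory-clock samples (MHz) from a sensor log,
# report the highest clock that held for several consecutive samples, so a
# single bogus 12-13 GHz spike doesn't dominate the "max" column.

def sustained_max(samples, window=3):
    """Max clock that held for `window` consecutive samples."""
    best = 0
    for i in range(len(samples) - window + 1):
        # the min over a window is the clock that held for the whole window
        best = max(best, min(samples[i:i + window]))
    return best

log = [96, 96, 2000, 12800, 2000, 2000, 2000, 96]  # idle -> glitch -> load
print(max(log))            # raw max: 12800, the bogus spike
print(sustained_max(log))  # 2000, the clock that actually held
```

Run over an exported HWiNFO or GPU-Z log, something like this would flag the 12-13GHz readings as isolated samples rather than real clocks.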


----------



## GamerGuy (Jul 5, 2021)

I think GPU-Z 2.40.0 is still reading the mem clock wrongly (or less wrongly compared to the previous version). The GPU clock looks okay, but the massive VRAM usage in RE Village is a bit of a shocker for me; >14GB is a tad much, isn't it? Or is GPU-Z reading that wrongly as well? This is after an hour of RE Village. Do note the massive amount of VRAM used, likewise for the screenshot below (can't recall what game I'd been playing though).





Below is a GPU-Z 2.38.0 screenshot of my RX 6900 XT; do note the ridiculous mem clock speed, though still not as bad as Kabouter Plop's mem spikes.


----------



## Space Lynx (Jul 5, 2021)

turbogear said:


> On the second thought, no no.... I am not brave enough to mode super expensive card.



probably a good idea to stick with that


----------



## Felix123BU (Jul 5, 2021)

GamerGuy said:


> I think GPU-Z 2.40.0 is still reading the mem clock wrongly (or less wrongly compared to the previous version). The GPU clock looks okay, but the massive VRAM usage in RE Village is a bit of a shocker for me; >14GB is a tad much, isn't it? Or is GPU-Z reading that wrongly as well? This is after an hour of RE Village. Do note the massive amount of VRAM used, likewise for the screenshot below (can't recall what game I'd been playing though).
> 
> 
> 
> ...


The overall memory usage is definitely incorrect as a reading of actual on-the-spot usage; after a while mine says 20GB used on the desktop without anything in the background. It creeps up bit by bit and does not drop until a restart; the max I saw was 34GB  . I guess it's actual memory usage plus whatever Windows caches.

Speaking of Windows, I'm dying to try out Win 11, especially if it has the Auto HDR feature which was delayed for Win 10.


----------



## Kabouter Plop (Jul 5, 2021)

I think my 6900 XT is killing itself: I can easily cause a TDR by going into Valheim in Vulkan while playing a video, and it will have a driver crash within 10 minutes, every single time, with memory peaking at 7.5GHz. No clean driver install seems to fix it.
After a factory reset I chose the standard profile, did not set a framerate target, turned vsync on this time, and did not enable Radeon Sharpening globally or in Valheim; no driver crash so far for 20 minutes, with PCIe set to 3.0. I'm probably gonna try my usual settings except keeping PCIe 3.0 and see if I can cause it to crash again, to narrow down the exact reason it crashes, because I never had crashes while multitasking and playing video before, except that Synology Surveillance Station live view consistently crashes the driver, almost even without playing games.
It crashes with PCIe 4.0 and far less, or not at all, with 3.0


----------



## Liviu Cojocaru (Jul 6, 2021)

Hey all, got my MSI Gaming Trio X 6900 XT yesterday. I had a Sapphire Nitro+ OC 6700XT briefly, but before these I'd only had Nvidia cards for the past 10 years. Is it worth it to OC this card? I use it with my Samsung Odyssey G9 at 5120x1440, gaming a few hours a day. 

This is what I have done for now:





Please let me know what you think  Thank you


----------



## Felix123BU (Jul 6, 2021)

Liviu Cojocaru said:


> Hey all, got my MSI Gaming Trio X 6900 XT yesterday. I had a Sapphire Nitro + OC 6700XT briefly but before these I only had Nvidia cards for the past 10 years. Is it worth it to OC this card? I use this with my Samsung Odyssey G9 5120x1440, gaming a few hours a day.
> 
> This is what I have done for now:
> 
> ...


Welcome to the club   

First of all, I would suggest you use the AMD Software utility (Performance -> Tuning). I've heard some people had issues with Afterburner, as in it causing weird glitches; that may or may not be the case for you. I know that coming from years of using Nvidia cards, Afterburner is the go-to tool, but for AMD cards their own software is in some ways more flexible for OC.

Secondly, for the 6000 series, if you want to OC it and play a bit with it, there are some simple rules for OCing with the AMD software: set a minimum and maximum frequency with a difference of 100MHz between them (for example 2500MHz max, 2400MHz min), and for starters always keep the voltage at its max setting while testing the OC. As for memory, my advice would be to not go over 2100MHz; that seems to be the max it can run stable and without errors. You might get it to run above 2100MHz, but that will most likely cause performance degradation.

One last thing: you can actually increase the power allowance beyond what AMD allows with the MorePowerTool from Igor's Lab, if your cooling can keep up with the extra power; it can give a nice extra performance boost.
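Those rules of thumb can be jotted down as a quick sanity check; this is just an illustrative sketch of the advice above, not any AMD tool or API:

```python
# Encode the rules of thumb from the post: keep the min clock about 100 MHz
# below the max, and don't push memory past 2100 MHz. The function name and
# exact limits are my own illustration.

def check_profile(min_mhz, max_mhz, mem_mhz):
    """Return warnings for a proposed RX 6000 core/memory OC."""
    warnings = []
    if max_mhz - min_mhz < 100:
        warnings.append("keep the min clock at least 100 MHz below the max")
    if mem_mhz > 2100:
        warnings.append("memory above 2100 MHz often degrades performance")
    return warnings

print(check_profile(2400, 2500, 2100))  # the example from the post: no warnings
print(check_profile(2450, 2500, 2150))  # too-narrow window and memory too high
```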

Enjoy your new toy


----------



## Kabouter Plop (Jul 6, 2021)

When I auto-overclock the memory it sets it to 2150, although I've just been running my 6900 XT on the Rage profile for 24/7 use


----------



## turbogear (Jul 6, 2021)

Kabouter Plop said:


> When I auto-overclock the memory it sets it to 2150, although I've just been running my 6900 XT on the Rage profile for 24/7 use


Auto overclock of memory is misleading, unfortunately.
I tried it myself. It set mine to 2180MHz, but my card doesn't actually work at that.
The FPS actually drops.  

My card actually degrades performance if I OC the memory even to 2050MHz.

I saw the videos from Buildzoid, who measured the voltages with an oscilloscope, and according to him one of the reasons is that unfortunately this time around AMD didn't do good power supply filtering on the reference design. 

Most non-reference cards are actually copying the reference design and extending it with some extra phases, but they didn't improve the power filtering.

Buildzoid added many capacitors and was able to reach a stable 2180MHz on the memory of a 6900XTU Red Devil Ultimate.

According to him, giving more voltage to the memory does not help.
He tried an EVC2 I2C hardware mod to increase the VRAM voltage and also to reduce it, but the best results he got were from just filtering the VRAM power supply with more capacitors rather than increasing the memory voltage.


----------



## Kabouter Plop (Jul 6, 2021)

Degrading performance is probably because of ECC correcting errors due to the high overclock


----------



## turbogear (Jul 6, 2021)

Kabouter Plop said:


> Degrading performance is probably because of ECC correcting errors due to the high overclock


Degrading performance for me even with a 50MHz OC.  

If the voltage is not clean when you OC, the voltage ripple increases because of the higher current consumption; that puts even more noise on the power supply, which leads to more noise on the data lines, and that can cause more ECC corrections.


----------



## Kabouter Plop (Jul 6, 2021)

Anyone here playing Valheim with vulkan on ?


----------



## Garlic (Jul 7, 2021)

Is there any way to increase the GPU voltage above the limit in the Radeon software? I tried with MPT and it didn't really work.


----------



## turbogear (Jul 7, 2021)

Garlic said:


> Is there any way to increase the GPU voltage above the limit in the Radeon software? I tried with MPT and it didn't really work.


No. The only way I know of would be a hardware mod with an EVC2 I2C controller, as Buildzoid does in the videos that I shared some posts above. 
There is a pin header on the GPU, as Buildzoid showed in his videos, to solder the EVC2 controller onto.


----------



## kiddagoat (Jul 7, 2021)

@turbogear since you have owned both the 6800XT and the 6900XT, if you could get the 6900XT for only 200 Euros more, would it be worth it?   

My local Microcenter has an open box 6900XT Liquid Devil for about $1900USD and 6800XT Liquid Devil for $1650USD....


----------



## Felix123BU (Jul 7, 2021)

Played a bit more Time Spy, got to no. 1 on the leaderboard for the combo of a 6800 XT and a Ryzen 5800X 

AMD Radeon RX 6800 XT video card benchmark result - AMD Ryzen 7 5800X,Gigabyte Technology Co., Ltd. B550 AORUS PRO (3dmark.com)

Only had to pump 370W through the card for this score  6800 XT reference FTW


----------



## turbogear (Jul 7, 2021)

kiddagoat said:


> @turbogear since you have owned both the 6800XT and the 6900XT, if you could get the 6900XT for only 200 Euros more, would it be worth it?
> 
> My local Microcenter has an open box 6900XT Liquid Devil for about $1900USD and 6800XT Liquid Devil for $1650USD....


That's a hard decision, and not easy to advise on.
If you get a 6800XT sample that has plenty of headroom for OC, then the difference to many of the 6900XTs is not that big.

Here is my experience with 5 RDNA2 cards. 
My first XFX 6800XT reference card, which I water cooled, was an OC monster and was not far behind the two 6900XTs that I had in my hands (PowerColor 6900XT Red Devil and Sapphire 6900XT Toxic).
Neither of those 6900XTs OCed higher than 2620MHz, whereas my 6800XT was OCed to 2750MHz.
So they both had about 5% more performance, but consumed about 70W more to produce that 5%.

After my XFX reference 6800XT got damaged by ESD, I bought a 6800XT Red Devil, but that was a disaster of a card. It did not OC higher than 2400MHz; if I pushed it further, it would artifact. I returned it.
After this I went to the 6900XT Liquid Devil Ultimate, which is about 10-12% faster than my old XFX 6800XT as it can OC to 2750MHz as well.
The drawback here is that my Liquid Devil does not OC on memory at all.

So, as a conclusion, it is really hard to advise, as it all depends on luck. 
If my XFX 6800XT had not got damaged, I would have kept it and not gone for the 6900XT Liquid Devil Ultimate; it was really a great card that OCed to 2750MHz with a max power setting of 325W on the GPU core, whereas my Liquid Devil is set to 415W GPU core for 2750MHz.
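To put those numbers side by side (the wattages and percentages are the estimates from this post, not measurements of mine):

```python
# Rough perf-per-watt comparison: the two 6900XTs gave ~5% more performance
# than the OCed 6800XT while drawing ~70W more on top of its 325W core power.
power_6800xt = 325                       # W, the XFX 6800XT at 2750 MHz
extra_power = 70                         # W, roughly what the 6900XTs added
perf_gain = 0.05                         # ~5% more performance
power_gain = extra_power / power_6800xt  # ~0.215
print(f"+{perf_gain:.0%} performance for +{power_gain:.0%} power draw")
# → +5% performance for +22% power draw
```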


----------



## Felix123BU (Jul 7, 2021)

turbogear said:


> That's a hard decision, and not easy to advise on.
> If you get a 6800XT sample that has plenty of headroom for OC, then the difference to many of the 6900XTs is not that big.
> 
> Here are my experience with 5 RDNA2 cards.
> ...


Very much agree with everything Turbogear said. One thing though: for only a 200 EUR difference I would get the 6900XT; the normal difference between MSRPs would be 350 EUR, so a 200 difference is nice  

And getting the 6900 XT would also basically remove the risk of ending up with a crappy-clocking 6800XT; even a crappy-clocking 6900 XT would be good enough versus an excellent-clocking 6800 XT.


----------



## turbogear (Jul 7, 2021)

Felix123BU said:


> played a bit more Time Spy, got to no.1 in the leaderboard for a combo of 6800 XT and a Ryzen 5800X
> 
> AMD Radeon RX 6800 XT video card benchmark result - AMD Ryzen 7 5800X,Gigabyte Technology Co., Ltd. B550 AORUS PRO (3dmark.com)
> 
> ...


WOW, you are going to beat my Liquid Devil if you keep pushing. 
What magic did you do? 

I assume you tuned the memory and CPU. 
I have seen a post on Guru3D where @spajdrEX got that high a score by tweaking his system, although the 6800XT itself is not pushed that much.








RDNA2 RX6000 Series Owners Thread, Tests, Mods, BIOS & Tweaks ! - forums.guru3d.com






Felix123BU said:


> Very much agree with everything Turbogear said. One thing though: for only a 200 EUR difference I would get the 6900XT; the normal difference between MSRPs would be 350 EUR, so a 200 difference is nice
> 
> And getting the 6900 XT would also basically remove the risk of ending up with a crappy-clocking 6800XT; even a crappy-clocking 6900 XT would be good enough versus an excellent-clocking 6800 XT.


Yes, I agree with what you said.


----------



## Felix123BU (Jul 7, 2021)

turbogear said:


> WOW, you are going to beat my Liquid Devil if you keep pushing.
> What magic did you do?
> 
> I assume you tuned the memory and CPU.
> ...


Did not change a lot on the CPU, except putting LM on it too, cause why not    and tuning the system memory as much as I could, but it's not the greatest kit; it can only do 3800MHz 1:1:1 CL16, and that for sure does not help the scores. A 3600MHz CL14 kit tuned to 3800+MHz CL14 would surely improve the scores.

On the GPU, I basically pushed it as hard as it can go (2735MHz max core, 2138 max mem). OK, I could allow more power through it for some extra points, but 370W on a reference card is already too much, though I bet it could take a couple of runs at 400W. For now my 3DMark itch is scratched


----------



## Space Lynx (Jul 7, 2021)

Felix123BU said:


> Did not change a lot on the CPU, except putting LM on it too, cause why not    and tuning the system memory as much as I could, but it's not the greatest kit; it can only do 3800MHz 1:1:1 CL16, and that for sure does not help the scores. A 3600MHz CL14 kit tuned to 3800+MHz CL14 would surely improve the scores.
> 
> On the GPU, I basically pushed it as hard as it can go (2735MHz max core, 2138 max mem). OK, I could allow more power through it for some extra points, but 370W on a reference card is already too much, though I bet it could take a couple of runs at 400W. For now my 3DMark itch is scratched



tightened subtimings seem to be more important than tightening the first four main timings (14-14-14-14).  (not sure if I worded it right but you get my point)


----------



## turbogear (Jul 7, 2021)

Felix123BU said:


> Did not change a lot on the CPU, except putting LM on it too, cause why not    and tuning the system memory as much as I could, but it's not the greatest kit; it can only do 3800MHz 1:1:1 CL16, and that for sure does not help the scores. A 3600MHz CL14 kit tuned to 3800+MHz CL14 would surely improve the scores.
> 
> On the GPU, I basically pushed it as hard as it can go (2735MHz max core, 2138 max mem). OK, I could allow more power through it for some extra points, but 370W on a reference card is already too much, though I bet it could take a couple of runs at 400W. For now my 3DMark itch is scratched


 You are the king of Time Spy on 6800XTs.


----------



## Felix123BU (Jul 7, 2021)

lynx29 said:


> tightened subtimings seem to be more important than tightening the first four main timings (14-14-14-14).  (not sure if I worded it right but you get my point)


I did that, tightened them a lot, but the main ones don't go below 16 no matter what voltage or other settings I apply at 3800MHz; it was never a top-performance RAM kit


----------



## Butanding1987 (Jul 7, 2021)

turbogear said:


> … where as my Liquid Devil is set to 415W GPU core for 2750MHz.


Time to replace that Liquid Devil Ultimate with the ASRock OC Formula, which has better filtering. 



Felix123BU said:


> I did that, tightened them a lot, but the main ones don't go below 16 no matter what voltage or other settings I apply at 3800MHz; it was never a top-performance RAM kit


Time to replace that 5800x with a 5950x and that RAM with a 3800 CL14 kit.


----------



## Felix123BU (Jul 8, 2021)

Butanding1987 said:


> Time to replace that Liquid Devil Ultimate with the Asrock OC Formula, which has better filtering.
> 
> 
> Time to replace that 5800x with a 5950x and that RAM with a 3800 CL14 kit.


Don't tempt me   but honestly, not gonna happen; that cash is reserved for a DDR5 system, for when DDR5 won't be crap


----------



## turbogear (Jul 8, 2021)

Butanding1987 said:


> Time to replace that Liquid Devil Ultimate with the Asrock OC Formula, which has better filtering.


Better supply filtering, but a crappier cooling system.  
You become limited by cooling rather than by filtering of the supply. 

Buy the OC Formula and design my own water block? 
I am unpacking my Dremel. A big chunk of copper is on its way.  

Really tempting, but I think I will stick with the Liquid Devil until RDNA3 arrives.

To be honest, it seems it is not only the filtering that prevents the memory from OCing well, but also the memory chips themselves. Buildzoid only gained 30MHz with his complicated capacitor mod. 

As Buildzoid mentioned in his videos, there are better 18Gbps memory chips that AMD is going to use on reference XTXH cards which go to system builders, but these are not used on other RDNA2 cards.









AMD Radeon RX 6900 XT LC Specs Confirmed: 12% Higher Core and Memory Clocks, 10% Higher Power - www.techpowerup.com
				




By the way, those reference LC cards are going to be limited by their thermal design. A single 120mm radiator wouldn't allow pushing anywhere near that high, as the hotspot temperature will head towards 100°C if one pushes, for example, 420W of GPU core power.

Imagine (dream of) a 6900XT custom liquid-cooled design with an XTXH chip, that 18Gbps memory, and a better supply filter design.  Those cards could be breaking records. 
Maybe AMD is waiting to see if Nvidia will launch a 3090 refresh before unleashing such a monster.


----------



## Felix123BU (Jul 8, 2021)

turbogear said:


> Better supply filtering, but a crappier cooling system.
> You become limited by cooling rather than by filtering of the supply.
> 
> Buy the OC Formula and design my own water block?
> ...


Seems legit regarding the faster-memory AMD 6900s; every 10-20 extra MHz provides a boost in FPS. I am still dreaming of a 6000 series with either HBM2e or GDDR6X, those would be killers


----------



## Djtbster (Jul 8, 2021)

Hi guys, has anyone put a universal waterblock on a 6000 series card yet?


----------



## turbogear (Jul 8, 2021)

Djtbster said:


> Hi guys, has anyone put a universal waterblock on a 6000 series card yet?


Sorry, I have only used the EKWB block, which is made specifically for 6800/6900 reference cards. 
I have not used any universal ones.

EK Water block for RX 6800 and RX 6900 RDNA2 GPUs – EK Webshop (ekwb.com)


----------



## Butanding1987 (Jul 9, 2021)

I finally made it.  I managed to raise the graphics performance by a notch but I think it was the CPU overclock that carried it to No. 1.


----------



## turbogear (Jul 9, 2021)

Butanding1987 said:


> I finally made it. I managed to raise the graphics performance by a notch but I think it was the CPU overclock that carried it to No. 1.
> 
> View attachment 207153
> View attachment 207154


It seems @Felix123BU motivated you to push further. 

I need to start doing some meditation to not get influenced by you two.


----------



## Felix123BU (Jul 9, 2021)

Butanding1987 said:


> I finally made it.  I managed to raise the graphics performance by a notch but I think it was the CPU overclock that carried it to No. 1.
> 
> View attachment 207153
> View attachment 207154


Game on, let's do battle    Whoever wins gets to be the closest to frying his card 



turbogear said:


> It seems @Felix123BU motivated you to push further.
> 
> I need to start doing some meditation to not get influenced by you two.


You know you cannot stand on the sidelines, you must join the competition


----------



## Kovski (Jul 10, 2021)

Is the artifacting in Cold War/Battlefield with the screen reflections ever going to be fixed, or is there at least a workaround? I bought the 6800XT a week ago and it's really frustrating, as these are the games I play the most and it's really obvious.


----------



## turbogear (Jul 11, 2021)

Kovski said:


> Is the artifacting in Cold War/Battlefield with the screen reflections ever going to be fixed, or is there at least a workaround? I bought the 6800XT a week ago and it's really frustrating, as these are the games I play the most and it's really obvious.


Welcome to the RDNA2 owners' thread, and welcome to TPU.  

Sorry, I have not played BFV in a long time, and I don't own Cold War.
Maybe somebody else here can confirm whether they see the same issue or know of a workaround.
I recommend reporting it to AMD; you can report bugs through the Radeon Software.

One of the TPU members, @INSTG8R, is in the AMD Vanguard beta tester program; maybe he knows more about this and can recommend a possible workaround.


----------



## GamerGuy (Jul 11, 2021)

I'd played the SP campaign mode as I'm an SP guy, and I had not noticed any weird artifacting when I played the game. I believe I ran the game fully maxed out at 3840x1080.


----------



## Kabouter Plop (Jul 17, 2021)

Getting almost 19k in Time Spy with the Rage profile and 10k+ in Port Royal, but Time Spy crashes a lot now. It was always problematic with crashes before, but now the GPU driver also crashes, though rarely, especially when the benchmark finishes or exits and the screen flickers. I'm very worried the GPU was a DOA dud; honestly, I have higher expectations for stability, even when overclocking it or running a preset like the Rage profile.


----------



## GerKNG (Jul 17, 2021)

22k in Time Spy with MPT is no problem, even with a Nitro+ (awful heatsink).

But damn, these cards run into insane power limits even at over 350W.
The difference between stock undervolting (1081mV get, 264W) and 1175mV get (1050mV set, 2700MHz target, 365W) is less than 50MHz.
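Those two data points make the diminishing returns easy to quantify; a back-of-the-envelope sketch using only the numbers in the post:

```python
# Cost of the last <50 MHz on this card, per the post: going from the stock
# undervolt (1081 mV get, 264 W) to 1175 mV get at a 2700 MHz target (365 W).
watts_stock_uv = 264
watts_pushed = 365
extra_watts = watts_pushed - watts_stock_uv  # 101 W
extra_mhz = 50                               # "less than 50 MHz" difference
print(f"~{extra_watts} W extra for under {extra_mhz} MHz: "
      f"more than {extra_watts / extra_mhz:.1f} W per MHz")
```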


----------



## Kabouter Plop (Jul 17, 2021)

I just realized that when I installed the water block I skipped 3 thermal pads, because there are no chips there and I figured there was simply no need; but now I think it may be shorting there, which is why the thermal pads are suggested






I wonder if I could somehow squeeze a thermal pad into the exact spot without disassembly, to stop it from shorting there, and see if that addresses the Time Spy stability; although I have no crashes in other games apart from Valheim in Vulkan, which is clearly the game's fault.
I may just drain the loop tomorrow, check, and add those pads anyway to be done with it; I don't remember if Time Spy also crashed before I put the water block on.




Forgot to show the block: bottom left is where there's no thermal pad. Basically everywhere else I put a thermal pad in each spot. At some point I even had hotspot temps of 66°C, even 69°C; now it does not exceed 62°C with MX-4.


----------



## Butanding1987 (Jul 18, 2021)

GerKNG said:


> 22k in Time Spy with MPT is no problem, even with a Nitro+ (awful heatsink).
> 
> But damn, these cards run into insane power limits even at over 350W.
> The difference between stock undervolting (1081mV get, 264W) and 1175mV get (1050mV set, 2700MHz target, 365W) is less than 50MHz.


Your Time Spy graphics score is relatively low for an overclocked 6900 XT. My 6800 XT can achieve more than that.

EDIT:
Oh, you’re on air. That explains it.


----------



## HOFF (Jul 20, 2021)

I built a new core system in the fall with a 5800x and was using my old 1080Ti until I joined the club last week, upgrading to the ASUS TUF 6800XT. Quite happy with this GPU. I've been tinkering with OC settings for the past week and finally landed on something I'm happy with in terms of balancing performance boost, stability and temps. Just figured out the whole mem-limit thing today: 2100MHz fast timings were causing performance issues, as illustrated in the stress-test screenshots where the core frequency would randomly dip to 0 (disregard the low temps in the 2.1GHz capture, I just caught it on a dip). I haven't tried lower voltage or higher clocks and could probably push this a lot further, but for now it's fine.

Only "weak spot" is my RAM; I'll upgrade to a 32GB B-die kit at some point, but I got this set for free, so it's fine for now. At least I can wring the neck of this kit; happy with the OC I got on them.

*My system specs:*
Case: Corsair iCUE 4000x RGB
CPU: AMD Ryzen 7 5800x PBO Optimized 5GHz, curve -5 on 2 best cores -10 everything else, FCLK 1900mhz
Motherboard: MSI x570 Tomahawk
Cooler: Corsair H115i PRO RGB
RAM: Corsair Vengeance LPX 16GB @ 3800Mhz CL16-19-16-36-60-1T
Video: ASUS TUF Gaming OC AMD Radeon 6800 XT
Power: Corsair RM750x
SSD 1: Sabrent Rocket PCIe 4.0 NVMe 2TB
SSD 2: Samsung 850 EVO 500GB
HDD: WD Black 1TB
Monitor: MSI Optix MAG274QRF-QD 1440p 165hz IPS
Monitor 2: Acer 24" 1080p 75hz IPS


*Overall GPU settings*




*Stress test consistency mem 2.1ghz vs 2.076ghz*







*Setup pics*


----------



## HOFF (Jul 20, 2021)

Kovski said:


> Is the artifacting in Cold War/Battlefield with the screen reflections ever going to be fixed, or is there at least a workaround? I bought the 6800XT a week ago and it's really frustrating, as these are the games I play the most and it's really obvious.


Turn post-processing in BF down to medium; that solves the issue. There is a console command you can set as well, but you have to do it manually every time you start the game. I found a Reddit post which discovered that even with ray tracing set to OFF, the game applies very low-quality ray tracing to reflections when post-processing is above medium, which is why they look like garbage.

It's not really an AMD thing, and based on this behaviour I wouldn't be surprised if Nvidia played a part in how the game is configured, making it look like crap on AMD cards when "maxed out"

First Time Spy demo test with my above settings before I start pushing things




bumped to 2450min/2550max




2500min/2600max, fans pretty much maxed which helped my cpu score


----------



## Butanding1987 (Jul 21, 2021)

Looks fine. Try a max of 2620MHz. The interval can be 200MHz for extra stability.


----------



## Valantar (Jul 21, 2021)

Hey, so I can finally join this thread! Got my RX 6900 Liquid Devil Ultimate installed a few days back. Getting that beast of a card into the Meshlicious wasn't exactly easy, but it worked out fine (even if it's 163mm according to PowerColor and the Meshy is listed at 155mm max according to SSUPD). Next up is some undervolting. I'll be going back through this thread and reading some more, but in case someone wants to help me out, is there a good UV guide out there? I can't seem to find much, and my early attempts from scratch didn't achieve anything (unstable at pretty much any setting).

I'm impressed that my loop can keep it acceptably cool at stock (60°C max when playing an hour of Metro Exodus last night, 1440p RT Ultra, with ~28°C ambient temps), but some undervolting and a light underclock is probably a good idea; a 330+W GPU plus a ~120W CPU on just a single 280mm rad is ... a stretch. Fluid temps peaked at 44.5°C, which is passable, but ... yeah. I'd like to get it down to 250W or less; I can gladly sacrifice a bit of performance for that.

Here's a quick pic of the GPU installed. The tubing is a mess; I'll be swapping the inlet and outlet ports to clean it up (both EK and PowerColor confirm that this should not affect performance), as the CPU outlet comes in from the top. I also had to remove the top plastic RGB thingy on the port block to make it fit, which isn't exactly much of a sacrifice.




And yes, the peels have since been removed.

Stock performance is pretty decent though:








That's straight OOTB, no settings touched. My CPU is running a slightly restrictive PPT/TDC/EDC config, so it's a bit slower than stock.


----------



## turbogear (Jul 21, 2021)

Felix123BU said:


> Game on, lets do battle   Whomever wins gets to be the closest to frying his card
> 
> 
> You know you can not stand on the sides, you must join the competition



@Felix123BU @Butanding1987
So boys it is time to strike back with my best personal score ever on the 6900XT Liquid Devil Ultimate. 
I was able to break the 23K barrier. 

Wait until I OC my DDR memory. The score is expected to increase. 
This score is with default setting on G.Skill F4-3600C14-8GTZNB CL14 Samsung B-Die kit.





This score ranks 11th among the best graphics scores for a 6900XT, and 64th overall for the combined score with this processor.








I scored 20 731 in Time Spy (AMD Ryzen 9 5900X, AMD Radeon RX 6900 XT) - www.3dmark.com
				







You will be wondering how the hell I achieved that now, after I had almost given up on tuning this card. 
My problem was that I could not OC the VRAM at all.
The solution came from the Hardwareluxx forum.

The BIOS for the new AMD reference liquid-cooled 6900XT became available.








Reference Liquid-Cooled Radeon RX 6900 XT Listed, Possibly the RX 6900 XTX, with Faster Memory - www.techpowerup.com
				




This card runs its memory at 18Gbps compared to 16Gbps on all other 6900XT cards.
What people found out is that AMD is using memory chips similar to those on the standard 6900XT, but applying different settings to OC the memory.
Plus maybe higher-binned memory, just as we have higher-binned XTXH GPUs.

I copied the memory voltage settings from this card and applied them to mine using the *new MPT v1.3.7 Beta 4*.
This new version of MPT supports modifying the VRAM voltage, as shown in the screenshot provided below.

I bumped VDDCI DPM3 from the default of 850mV to 900mV, and MVDD DPM3 from the default of 1350mV to 1400mV.
These two changes stabilized my memory, and I can now OC the VRAM to 2140MHz. 
Before doing this, even 2020MHz was unstable and crashed when running benchmarks.

In theory one can go even higher; people have tried up to 1500mV on MVDD DPM3, but one needs to be careful.
Increasing this voltage raises the temperature of the memory.
For me the memory junction temperature went up by 3°C, from 47°C to 50°C, which I checked in HWiNFO.
I have a water-cooled card, so on air-cooled ones this could get out of control very quickly.






Here are the settings from the Radeon software.


Power settings applied, taking into account the MPT values and the 15% slider in the Radeon driver:






A link to the new AMD Reference XTXH-LC thread can be found here.








						[Sammelthread] - Offizieller  AMD  [[ RX6700 // RX 6700XT // X6800 // 6800XT // 6900XT ]] Overclocking und Modding Thread  [[ Wakü - Lukü - LN2]]
					

In my opinion, this was initially the counter to the Ti cards. Personally, I think AMD already knew about the performance at launch. The 6800 non-XT is a good example: the voltage limit was lowered from 1050 to 1025. Performance is simply raised bit by bit via the driver. Sometimes...




					www.hardwareluxx.de
				




Note: I didn't flash this BIOS onto my card; I only read its settings using MPT and changed the voltages on my own card.
Some Hardwareluxx members flashed it using a special external programmer. 
Two of them made it work that way: one on an ASRock 6900XT OC Formula and the other on a 6900XT Liquid Devil Ultimate.
The usual software flashing tools don't work for this.


----------



## Butanding1987 (Jul 22, 2021)

turbogear said:


> @Felix123BU @Butanding1987
> So boys, it's time to strike back with my best personal score ever on the 6900XT Liquid Devil Ultimate.
> I was able to break the 23K barrier.
> 
> ...


Nice! I should try that on my 6800 XT VRAM.  I think the Samsung chips on the 6900 XT and 6800 XT are the same?
Your CPU score is kind of low and pulled your total score down. It’s time to overclock that 5900X to death.


----------



## GerKNG (Jul 22, 2021)

Butanding1987 said:


> Your Time Spy graphics score is relatively low for an overclocked 6900 XT. My 6800 XT can achieve more than that.
> 
> EDIT:
> Oh, you’re on air. That explains it.


the world record with a 6800XT is not even 23k


----------



## Butanding1987 (Jul 22, 2021)

GerKNG said:


> the world record with a 6800XT is not even 23k


It's 66 points short of 23,000, which is amazing. More is expected from your 6900 XT. We're rallying behind you.


----------



## turbogear (Jul 22, 2021)

Butanding1987 said:


> Nice! I should try that on my 6800 XT VRAM.  I think the Samsung chips on the 6900 XT and 6800 XT are the same?
> Your CPU score is kind of low and pulled your total score down. It’s time to overclock that 5900X to death.


Yes, the VRAM is the same on both.
You can try it, but it doesn't seem to help in every case,
especially if the memory can already OC high without it.
On Hardwareluxx only a few people saw a performance gain.

I need to work on my CPU and memory.
The CPU is only curve-optimized, not extra OCed, at the moment.
Same with the DDR: it is running at XMP only.



Valantar said:


> Hey, so finally I can join this thread! Got my RX 6900 Liquid Devil Ultimate installed a few days back. Getting that beast of a card into the Meshlicious wasn't exactly easy, but it worked out fine (even if it's 163mm according to PC and the Meshy is listed at 155mm max according to SSUPD). Next up is some undervolting. I'll be going back through this thread and reading some more, but in case someone wants to help me out, is there a good UV guide out there? I can't seem to find anything much, and my early attempts from scratch didn't do anything much (unstable at pretty much any setting).
> 
> I'm impressed that my loop can keep it acceptably cool at stock (60°C max when playing an hour of Metro Exodus last night, 1440p RT Ultra, with ~28°C ambient temps), but some undervolting and a light underclock is probably a good idea - a 330+W GPU plus a ~120W CPU with just a single 280mm rad is ... a stretch. Fluid temps peaked at 44.5°C, which is passable, but ... yeah. I'd like to get it down to 250W or less. I can gladly sacrifice a bit of performance for that.
> 
> ...


Welcome to the club. 
I thought you were looking to OC your Liquid Devil to squeeze out all the performance possible, but you actually want to undervolt and underclock it, which will decrease performance.

I didn't work in that direction. I usually try to find the lowest voltage at which the maximum possible GPU clock is still stable.
For example, my Liquid Devil's highest stable Time Spy GPU clock setting is 2750MHz, with the minimum frequency set to 2550MHz. For this setting I started at 1200mV and decreased the voltage in 10-15mV steps, running the Time Spy benchmark after every step, until Time Spy started to crash.
So for a 2750MHz clock, the minimum stable voltage for me is 1155mV.
After I find this setting, I run a stability test like the Fire Strike Extreme Stress Test to check that it holds.
It is a time-consuming process to find the correct settings. 
Every GPU is an individual and needs to be tuned separately.
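The step-down search described above can be sketched as a short loop. `is_stable` below is a hypothetical stand-in for "run one Time Spy pass at this voltage and see whether it crashes"; nothing here talks to real hardware.

```python
def find_min_stable_mv(is_stable, start_mv=1200, step_mv=15, floor_mv=1000):
    """Walk the GPU voltage down in fixed steps; return the last voltage
    at which the benchmark still passed (None if even the starting
    voltage fails)."""
    last_good = None
    mv = start_mv
    while mv >= floor_mv:
        if not is_stable(mv):   # first crash ends the search
            break
        last_good = mv
        mv -= step_mv
    return last_good

# With a fake card that is stable down to exactly 1155 mV, the search
# from 1200 mV in 15 mV steps lands on 1155 mV, as in the post above.
print(find_min_stable_mv(lambda mv: mv >= 1155))  # prints 1155
```

As the post notes, the result of a single pass is only a candidate; a longer stress run (e.g. the Fire Strike Extreme Stress Test) is still needed to confirm it.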

I found one guide on how to OC, but it is in German. You can Google Translate it: 








						[Guide] - Navi 21 Max Overclocking Tutorial [6800 XT / 69X0 XT]
					

If you want to know what your own Navi 21 card can really do, but don't know how to go about it, you are in the right place. A typical case is this: you bought the card and now it runs much slower than for the big boys in the Luxx forum. What to do? Table of contents 1. Time Spy: the (almost)...




					www.hardwareluxx.de


----------



## Valantar (Jul 22, 2021)

turbogear said:


> Welcome to the club.
> I thought you were looking to OC your Liquid Devil to squeeze out all the performance possible, but you actually want to undervolt and underclock it, which will decrease performance.
> 
> I did not work in that direction. I usually try to find the lowest voltage where the maximum possible clock for GPU is possible.
> ...


Thanks! I know it's a bit of a shame to not try to push this monster of a GPU as far as it can go, but tbh I only bought it because of availability, the included water block and running out of patience. I really wanted a 6800 XT, but gave up after waiting months and months. Cooling an OC'd version of the LDU in my setup simply isn't feasible, and I do like the idea of maximising efficiency - RDNA2 can run very cool if tuned properly after all. Thanks for linking to that guide  - from a quick skim it looks very informative. I was mainly confused about the min/max clock settings and how to treat them, seeing how the stock setting has a huge range but most OCs seem to only space them out about 200MHz. If that's the recommended/best practice I'll just stick to that. The voltage guidelines in the guide were also really informative.

Anyhow, I'll try my hand at some tuning and post my results here. Hopefully my chip isn't a complete dud when it comes to undervolting


----------



## turbogear (Jul 22, 2021)

Valantar said:


> Thanks! I know it's a bit of a shame to not try to push this monster of a GPU as far as it can go, but tbh I only bought it because of availability, the included water block and running out of patience. I really wanted a 6800 XT, but gave up after waiting months and months. Cooling an OC'd version of the LDU in my setup simply isn't feasible, and I do like the idea of maximising efficiency - RDNA2 can run very cool if tuned properly after all. Thanks for linking to that guide  - from a quick skim it looks very informative. I was mainly confused about the min/max clock settings and how to treat them, seeing how the stock setting has a huge range but most OCs seem to only space them out about 200MHz. If that's the recommended/best practice I'll just stick to that. The voltage guidelines in the guide were also really informative.
> 
> Anyhow, I'll try my hand at some tuning and post my results here. Hopefully my chip isn't a complete dud when it comes to undervolting


The offset between the min and max frequency is used so that the GPU is forced not to drop into idle mode for small fractions of time during gaming and benchmarks.
Many people see suboptimal performance when the min frequency is left at 500MHz.
Buildzoid showed on his YouTube channel that the boost algorithm causes the GPU to drop into a low-power state for a few milliseconds even while under load. This behavior is mitigated by raising the min frequency to 100-200MHz below the max frequency.
The exact offset is sample-dependent: tighter is helpful for performance, but not many 6900XTs run stable at, for example, a 100MHz difference.
On my 6800XT I previously ran a 100MHz offset, but my Liquid Devil is not stable below a 200MHz offset.
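That rule of thumb is easy to write down: pick the minimum clock a fixed offset below the maximum. A minimal sketch follows; the function name and the 100-200MHz bounds are just the heuristic from this post, not anything official from AMD.

```python
def wattman_clock_pair(max_mhz, offset_mhz=200):
    """Return (min, max) Wattman clocks with the min held a fixed offset
    below the max, so the boost algorithm cannot dip into low-power
    states mid-load. Workable offsets are sample-dependent, roughly
    100-200 MHz on Navi 21 per this thread's experience."""
    if not (100 <= offset_mhz <= 200):
        raise ValueError("offset outside the ~100-200 MHz heuristic range")
    return max_mhz - offset_mhz, max_mhz

print(wattman_clock_pair(2750))  # (2550, 2750), the Liquid Devil setting above
```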

Regarding undervolting, I wish you luck; the 6900XTs don't like much undervolting.  
I could undervolt my 6800XT to 950mV, but my Liquid Devil doesn't like voltages below 1100mV, even when I don't OC it and run it at the default frequency.


----------



## Butanding1987 (Jul 22, 2021)

I just need 77 more points.


----------



## turbogear (Jul 22, 2021)

Butanding1987 said:


> I just need 77 more points.
> 
> View attachment 209245


Your CPU results are amazing: 16011 compared to my 12646. 
You also have a 5900X, like me.
Are you doing a manual OC, or is that only the Curve Optimizer?


----------



## Butanding1987 (Jul 22, 2021)

turbogear said:


> Your CPU results are amazing. 16011 compared to my 12646.
> You have also 5900X like me.
> Are you doing manual OC or that is only Curve Optimization?


I'm using CTR 2.1.


----------



## turbogear (Jul 22, 2021)

Butanding1987 said:


> I'm using CTR 2.1.


I tried CTR 2.1 some time ago.
Somehow the settings it provided for my 5900X were never stable.
I got a crash as soon as I started the Cinebench benchmark after applying those settings. 
I didn't try to tune the CTR settings manually; instead I went back to using the Curve Optimizer.

I used CTR 2.0 on a 5800X before without issues.

There is also the Dynamic OC Switcher available on my ASUS Crosshair VIII Dark Hero, which does something similar to CTR.
I haven't tested it yet.
I guess I need to learn a little about it to get more performance.


----------



## Valantar (Jul 22, 2021)

turbogear said:


> The offset between the min and max frequency is used so that the GPU is forced not to drop into idle mode for small fractions of time during gaming and benchmarks.
> Many people see suboptimal performance when the min frequency is left at 500MHz.
> Buildzoid showed on his YouTube channel that the boost algorithm causes the GPU to drop into a low-power state for a few milliseconds even while under load. This behavior is mitigated by raising the min frequency to 100-200MHz below the max frequency.
> The exact offset is sample-dependent: tighter is helpful for performance, but not many 6900XTs run stable at, for example, a 100MHz difference.
> ...


Ah, that sounds sensible. I briefly tried some quick UV attempts two days ago, but left the min frequency at 500, and couldn't really get anything at all stable. I'll have to play around with it a bit - the LDU does start at some pretty high stock clocks (500/2614, 1200mV is what Manual OC in wattman gives me, at least), so I can understand that it takes some underclocking to reduce the voltage noticeably. I tried the auto UV option in wattman, which resulted in 1175mV, which ... well, that's not very impressive  Still, playing on a 1440p60 monitor for now (or my secondary 1080p75 one if 60Hz gets on my nerves) means I can safely sacrifice some performance without noticing. We'll see if I have to start adding radiators once I upgrade my main monitor  I don't have any hard goals, but if I can get it down to 250W while maintaining 2.2-2.3GHz I'd be happy with that. Who knows, maybe I'll go for an extreme UV/downclock profile and see just how efficient I can make it 



Butanding1987 said:


> I just need 77 more points.
> 
> View attachment 209245


Wow, that's close: only about 0.355% behind! Makes me wonder if 77 points could be within run-to-run variance? Unless your current result is a peak among many runs, of course. Which I guess it might be, at that.


----------



## GamerGuy (Jul 22, 2021)

Awesome!  Shite just got real!  Lemme grab a bag of popcorn before you guys throw down the gauntlets and duke it out!


----------



## Butanding1987 (Jul 22, 2021)

I'm tired of Time Spy. Let's shift to Fire Strike Ultra. If only I could gain one more point. 





I did it! LOL. Sorry for trolling the forum.  






turbogear said:


> I tried the CTR 2.1 some time ago.
> Somehow the setting that it provided for my 5900X were never stable.
> I got crash as soon as I started Cinebench benchmark after applying these settings.
> I did not try manually to tune the CTR setting but went afterwards and started using Curve Optimizer.
> ...


The default values that CTR 2.1 gives are unreliable. You still have to tweak them. But once you get the right voltage and frequencies, it will pay off.


----------



## Felix123BU (Jul 22, 2021)

Butanding1987 said:


> I'm tired of Time Spy. Let's shift to Fire Strike Ultra. If only I could gain one more point.
> View attachment 209270
> 
> I did it! LOL. Sorry for trolling the forum.
> ...


Hmm, in Fire Strike Ultra I lag your graphics score by 446 points (14523 for me). Maybe doable, let me see; I'm in the mood to play some 3DMark  

One thing I can't reach is your Time Spy Extreme graphics score; that's way off for me for now 

Also, what driver version are you using right now?

Update 1 --- ALL HAIL THE MIGHTY 6800XTs  

@Butanding1987, I beat your Fire Strike Ultra graphics score by .... 18 points. The interesting part is that I needed a full-core CPU OC to get this score; otherwise I was constantly 50 points below. Second, I found out you cannot go higher than 2700MHz on the minimum GPU clock 

The sweet part is that we have some of the fastest 6800XTs on the planet; that's cool if you think about it 





Let's see now if I can get close to your Time Spy scores. In Fire Strike my 6800XT plays along great, less so in Time Spy, but let's see


----------



## Butanding1987 (Jul 22, 2021)

Felix123BU said:


> Hmm, in Firestrike Ultra I lag you graphics score by 446 points (14523 for me), maybe doable, let me see, in the mood to play some 3D mark
> 
> One thing I cant reach is your Time Spy Extreme Graphics score, that's way off for me for now
> 
> ...


That's the spirit.  
I topped that Fire Strike score a few hours ago (No. 2 now, after ViKTOR), but I'm reluctant to post it because I think I can still improve it.  
If I'm not mistaken, I now have the top graphics score (6800 XT, across all CPUs) in Time Spy, Time Spy Extreme and Fire Strike. Still working on Fire Strike Extreme  

PS: I'm using the latest driver.

I've managed to break the 23k level in Time Spy graphics score.


----------



## turbogear (Jul 23, 2021)

@Felix123BU @Butanding1987, not bad. 
You are the kings of the 6800XT. 

This reminds me of my super 6800XT card. That card was faster than yours at similar settings and drivers at the time. 
R.I.P., my brave 6800XT, in graphics card heaven.


----------



## Butanding1987 (Jul 23, 2021)

turbogear said:


> @Felix123BU @Butanding1987 not bad.
> You are the kings of 6800XT.
> 
> This reminds me of my super 6800XT card. That card was faster than yours on similar settings and drivers at that time.
> R.I.P. my brave 6800XT in the graphic cards heaven.


I have a feeling I won't be king for long.


----------



## Felix123BU (Jul 23, 2021)

Butanding1987 said:


> That's the spirit.
> I topped that Fire Strike score a few hours ago (No. 2 now, after ViKTOR), but I'm reluctant to post it because I think I can still improve it.
> If I'm not mistaken, I now have the top graphics score (6800 XT, across all CPUs) in Time Spy, Time Spy Extreme and Fire Strike. Still working on Fire Strike Extreme
> 
> ...


Yup, congrats, you managed to get the top score in almost every category for a 6800XT. The quality of the card counts, but so does the skill to push it that far  



Butanding1987 said:


> I have a feeling I won't be king for long.


You won't get further competition from me at this point in time; running 380W through a reference 6800XT was a nail-biting experience, though I am reasonably sure it could take 400W for short bursts 

I am content having the No. 1 overall score for the 6800XT + 5800X combo in Time Spy Extreme, and the No. 1 graphics score for this combo in Fire Strike and Fire Strike Ultra, for now. Maybe next time.

It has been fun, and congrats on pushing that 6800XT of yours that high


----------



## turbogear (Jul 23, 2021)

Felix123BU said:


> Yup, congrats, you managed to get the top scores in almost any category for a 6800XT, the quality of the card counts, but so does the skill to push it that far
> 
> 
> You wont get further competition from me at this point in time, running 380w through a reference 6800XT was a nail biting experience, though I am reasonable sure it could take 400w for short bursts
> ...


Yes, my 6900XT Liquid Devil also seems to be at its limit.
It does not go higher than the 2750MHz frequency setting.
I am lucky that I was able to tune the VRAM at all after raising the voltage to 1400mV; it did not OC at all without that mod.

I am still looking at it; maybe I have not found the real sweet spot.
If I find any further tricks to squeeze more performance out of the GPU, I will of course share them and tempt you both to push further. 

There are some things for real daredevils,   
like hardware mods with the Elmor EVC2SX to raise the GPU voltage, or soldering on capacitors to clean up the voltages, which gives higher frequencies. 

For me, I think the real remaining potential is in the 5900X and memory tuning.  

I have not done much there until now, except a little curve tuning, and even there I did not spend the time to find the last percent of performance like I did on the GPU. It's been on my agenda for some time.


----------



## Felix123BU (Jul 23, 2021)

turbogear said:


> Yes, my 6900XT Liquid Devil also seems to be at its limit.
> It does not go higher than the 2750MHz frequency setting.
> I am lucky that I was able to tune the VRAM at all after raising the voltage to 1400mV; it did not OC at all without that mod.
> 
> ...


Are you sure that the RAM voltage changes are actually being applied? I am not 100% sure, though since raising the voltage as you mentioned earlier (and thanks for the info, by the way), I seem to be able to push the memory to 2150MHz without losing performance, so I am guessing it works. It was a long-missing feature for OCing these cards; memory speed is holding this generation in check


----------



## turbogear (Jul 23, 2021)

Felix123BU said:


> Are you sure that the RAM voltage changes are actually being applied? I am not 100% sure, though since raising the voltage as you mentioned earlier (and thanks for the info, by the way), I seem to be able to push the memory to 2150MHz without losing performance, so I am guessing it works. It was a long-missing feature for OCing these cards; memory speed is holding this generation in check


That is a good question.
Looking at HWiNFO I don't see it applied, but after doing this I could raise my VRAM frequency from 2000MHz to 2140MHz.
Without this mod I start getting score regression already at 2020MHz.

By the way, in the MPT frequency tab there is also an FCLK (Infinity Fabric clock) setting.
Some people are experimenting with this to get more performance.
I have not tried it.


----------



## Kabouter Plop (Jul 24, 2021)

Decided to redo the thermal paste just to see if I had used enough; it was fine. I also added the missing thermal pad just to be sure, even though there was 2-3mm of space. Time Spy still crashes at times, producing no result; it seems especially worse with 21.7.1. I had artifacting in Doom Eternal as well, but a two-hour VRAM test found no errors at all. Doom Eternal had a patch recently to fix artifacting in some areas, while Time Spy recently fixed a memory-related bug. I'm not exactly happy with my 6900 XT; at this point I wish I could trade it for an RTX 3080 Ti, or an RTX 3090 with a water block installed.

No games crash either, except Valheim in Vulkan. AMD drivers are very frustrating; I never had to DDU a single driver with NVIDIA in the past, and I have already experienced inconsistencies due to drivers that magically go away after a DDU.

Anyone else have similar issues with Time Spy? It sometimes does 20 loops with no problem, can sometimes pass up to 80, and then it randomly starts crashing, producing no result consistently every 5 or 10 loops.


----------



## GerKNG (Jul 24, 2021)

Kabouter Plop said:


> No games crash either, except Valheim in Vulkan


don't worry.. it is pretty broken since day one with vulkan (not just with AMD Cards)


----------



## Kabouter Plop (Jul 24, 2021)

GerKNG said:


> don't worry.. it is pretty broken since day one with vulkan (not just with AMD Cards)



Still worried, because I had artifacts in Doom Eternal and Time Spy crashes producing no results; it freaks me out. I have no AMD driver experience at all and wish I had avoided it.


----------



## GerKNG (Jul 24, 2021)

Kabouter Plop said:


> Still worried, because I had artifacts in Doom Eternal and Time Spy crashes producing no results; it freaks me out. I have no AMD driver experience at all and wish I had avoided it.


Time Spy crashes are common (for me) since day one (I have two 6900XTs).
My OC is stable in everything except some 3DMark benchmarks.
Sometimes I run a Time Spy stress test for 20 minutes with zero issues; after that, the benchmark crashes at the beginning of GT2.

Only at stock (or undervolted) does everything work fine.


----------



## Kabouter Plop (Jul 24, 2021)

Time Spy has crashed at stock too, maybe a bit less commonly, I'm not sure, but it was the first thing I noticed in the first month. Anyway, if I am not alone with these issues, then I am somewhat relieved.


----------



## GerKNG (Jul 24, 2021)

Kabouter Plop said:


> Time Spy has crashed at stock too, maybe a bit less commonly, I'm not sure, but it was the first thing I noticed in the first month. Anyway, if I am not alone with these issues, then I am somewhat relieved.


You are not alone. 
Both my Nitro+ and my 319 Black crash in Time Spy (sometimes)


----------



## Garlic (Jul 25, 2021)

I raised my VRAM voltage (to 900mV/1400mV) and found that giving the VRAM more voltage somehow lets me run a higher core clock.

That aside, is there anything I can do in this part of MPT? Thanks.


----------



## Felix123BU (Jul 25, 2021)

Kabouter Plop said:


> Time Spy has crashed at stock too, maybe a bit less commonly, I'm not sure, but it was the first thing I noticed in the first month. Anyway, if I am not alone with these issues, then I am somewhat relieved.


Time Spy is not very friendly with an OC. At stock or with a mild OC it's fine, but with a more aggressive OC (one that runs smooth as butter in Fire Strike) it crashes quickly in Time Spy. My worst-performing results are in Time Spy because of the instability while overclocked.

I don't really get why you are unhappy with your card. For me this has been the most bug-free and stable experience of any of the 15+ cards I have owned, AMD and NVIDIA, and I have tested around 20 games, new and old, some of which I played through and finished. But well, some people could be unlucky; there are too many hardware combos out there


----------



## Kabouter Plop (Jul 25, 2021)

I would not consider this a fun experience. The top two are confirmed driver bugs; the third one I have seen another user complain about recently on Reddit, and the game has a recent patch from July 14 that fixed artifacting issues in a few environments. Before I started Doom Eternal, I produced a driver crash that is 100% reproducible simply by opening the live view in my Synology Surveillance Station, and the artifacts usually appear after the driver crashes, even when I trigger the crash intentionally. The game was started after the driver crash, not before, or it would have crashed the game as well.


----------



## Jazz107 (Jul 25, 2021)

Kabouter Plop said:


> Time Spy has crashed at stock too, maybe a bit less commonly, I'm not sure, but it was the first thing I noticed in the first month. Anyway, if I am not alone with these issues, then I am somewhat relieved.



I've had two 6900 XTs, a reference and a Nitro+, and both could crash in Time Spy GT2 even at stock


----------



## Butanding1987 (Jul 26, 2021)

turbogear said:


> There are some things for real daredevils,
> like hardware mods with the Elmor EVC2SX to raise the GPU voltage, or soldering on capacitors to clean up the voltages, which gives higher frequencies.
> 
> For me, I think the real remaining potential is in the 5900X and memory tuning.
> ...


Why do you assume that @Felix123BU and I aren’t “real daredevils”? Would you believe me if I told you that I soldered an EVC2 to my 6800 XT?


----------



## turbogear (Jul 26, 2021)

Butanding1987 said:


> Why do you assume that @Felix123BU and I aren’t “real daredevils”? Would you believe me if I told you that I soldered an EVC2 to my 6800 XT?


You, my friend, are a real daredevil. 

I am going to flash my 6900XTU with the XTXH-LC BIOS, for which one needs an external programmer. 

This thing is on its way from Amazon:





						KeeYees SOIC8 SOP8 Test Clip für EEPROM 25CXX / 24CXX + CH341A 24 25 Serie EEPROM Flash BIOS USB Programmer: Amazon.de: Gewerbe, Industrie & Wissenschaft
					




					www.amazon.de
				




The XTXH-LC BIOS does not brick the Liquid Devil, but it cannot be flashed with the Windows and Linux tools because it has a different ID.

I will be ordering an EVC2SX myself.  
I was looking for it yesterday on Amazon, but I think one needs to order it directly from ElmorLabs.



Garlic said:


> I made my vram voltage higher (to 900mv/1400mv) and I found out that by giving the vram more voltage it somehow makes me have a higher core clock.
> 
> That aside is there anything I can do in this part of mpt? Thanks.
> View attachment 209761



Yes, for me as well, higher VRAM voltage gave a higher VRAM clock.
I have not tried this part of MPT until now.
Reading the Hardwareluxx and Igor'sLAB forums, some people tried to play around with things like Fclk and FclkBoostFreq, but they got only very small FPS gains.



turbogear said:


> You, my friend, are a real daredevil.
> 
> I am going to flash my 6900XTU with the XTXH-LC bios for which one needs external programmer.
> 
> ...


The BIOS programmer has already arrived.
Now I need to get it all set up on my laptop before attempting to flash the XTXH-LC BIOS onto the 6900XTU. 

The EVC2SX has been ordered.
I think it will take a few weeks for it to arrive from Taiwan to Germany.

I am also working on CTR 2.1.
The initial settings are just stupidly unstable. I now have P1 and P2 working after manual tuning. I need to fine-tune further, as Cinebench sometimes crashes with an error message at these settings, even though the built-in stability test in CTR ran through without issues.
The multi-threaded score was around 8960.
With the Curve Optimizer I got a max of 8700.


----------



## Butanding1987 (Jul 27, 2021)

turbogear said:


> The bios programmer has already arrived.
> Now need to get it all setup on my laptop before attempting to flash XTXH-LC bios onto 6900XTU.
> 
> I am also working on CTR2.1.
> ...


Keep us posted.


----------



## jesdals (Jul 27, 2021)

Reference cards and others might benefit from more than just adding thermal pads to the backplate. For a while I have been using a 90mm fan blowing down on the back side of the GPU mount; adding a 120mm placed near the front of the card, with the fans above the GPU blowing down, gave another 2 degrees Celsius on memory temps. So add a fan and keep it cool


----------



## Felix123BU (Jul 27, 2021)

jesdals said:


> Reference cards and others might benefit from more than just adding thermal pads to the backplate. For a while I have been using a 90mm fan blowing down on the back side of the GPU mount; adding a 120mm placed near the front of the card, with the fans above the GPU blowing down, gave another 2 degrees Celsius on memory temps. So add a fan and keep it cool


Yeah, generally correct, though I must say the 6800XT reference has a really decent stock cooler. I tried adding some pads on the back and could not notice any meaningful difference on the VRAM or power circuitry versus no extra pads on the back (with good internal airflow). I was a bit wary when I got it, as AMD reference coolers were always utter garbage, but for the 6000 series they are surprisingly good   

But as you say, a combo of extra pads on the back and a fan blowing on it would shave a couple of degrees off.


----------



## Kabouter Plop (Jul 27, 2021)

My GPU temp dropped 4-6°C when I applied the missing thermal pad where there was 2-3mm of space between the block and the contact points. I don't think there was any risk of shorting, but I added it to be sure and tightened it a bit more too; the hotspot seems unchanged so far.
The only places I have pads on the back are the memory and core, like EKWB suggested; maybe I could have added a few extra pads at the VRM spots on the back as well.


----------



## Liviu Cojocaru (Jul 27, 2021)

Hey guys, I have an issue with my MSI Gaming Trio X 6900XT: the junction temp climbs very high, up to 110-114°C, if I increase the power limit to the max.
I have undervolted it to 1070mV on the GPU with the power limit at default (0), and I get a GPU frequency in games between 2450 and 2520MHz, with the memory OCed to 2088MHz. 
The junction temp in this case goes up to around 90-100°C.

I saw a review pointing out that there seems to be some kind of issue with this specific cooler and how it makes contact with the chip. I am quite disappointed, as I have also seen that this might be one of the best PCBs and VRMs on a 6900XT, so great for OC, and yet it is useless in this instance.

Any ideas? I was thinking about replacing the cooler altogether, but there are not a lot of options.


----------



## Felix123BU (Jul 27, 2021)

Liviu Cojocaru said:


> Hey guys, I have an issue with my MSI Gaming Trio X 6900XT: the junction temp climbs very high, up to 110-114°C, if I increase the power limit to the max.
> I have undervolted it to 1070mV on the GPU with the power limit at default (0), and I get a GPU frequency in games between 2450 and 2520MHz, with the memory OCed to 2088MHz.
> The junction temp in this case goes up to around 90-100°C
> 
> ...


What's the difference between the GPU temp and the junction temp? If it's higher than 25°C, the mounting pressure is bad; if it's around that range, it can also be the mounting pressure, badly applied paste, or even bad ventilation inside the case causing pockets of hot air to be trapped and overheat the card.
MSI doesn't have the best track record with GPUs lately. What you could do is try to tighten the four screws above the chip a bit, but that can be tricky, because at least one is usually covered by a warranty sticker, and tightening too much could damage the chip, though you would need to apply a lot of force and the mounting mechanism usually prevents excessive force.
Other than that, there is replacing the paste, which also means opening the card. The best option is a water block, but those add extra cost, even more so if you don't have the other components for a water loop.
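That first diagnostic (compare the edge and junction readings) can be captured in a few lines. The ~25°C threshold is just the rule of thumb from this thread, not an AMD specification, and the function name is made up for illustration.

```python
def junction_delta_check(edge_c, junction_c, threshold_c=25):
    """Return the junction-minus-edge delta in degrees C and whether it
    exceeds the rough threshold above which bad mounting pressure or a
    poor paste job becomes the prime suspect."""
    delta = junction_c - edge_c
    return delta, delta > threshold_c

# A ~30 degree delta, as reported for the Gaming Trio X, trips the check.
print(junction_delta_check(80, 110))  # prints (30, True)
```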


----------



## Liviu Cojocaru (Jul 27, 2021)

Felix123BU said:


> What's the difference between the GPU temp and the junction temp? If it's higher than 25°C, the mounting pressure is bad; if it's around that range, it can also be the mounting pressure, badly applied paste, or even bad ventilation inside the case causing pockets of hot air to be trapped and overheat the card.
> MSI doesn't have the best track record with GPUs lately. What you could do is try to tighten the four screws above the chip a bit, but that can be tricky, because at least one is usually covered by a warranty sticker, and tightening too much could damage the chip, though you would need to apply a lot of force and the mounting mechanism usually prevents excessive force.
> Other than that, there is replacing the paste, which also means opening the card. The best option is a water block, but those add extra cost, even more so if you don't have the other components for a water loop.


Yes, the difference in temps is about 30°C. I don't have the means or the time to set up water cooling, so I can't add the water block... I don't want to lose the warranty either.
I will try and tighten up the other 3 screws and see how it goes.


----------



## Butanding1987 (Jul 27, 2021)

Liviu Cojocaru said:


> I will try and tighten up the other 3 screws and see how it goes.


Don’t tighten the screws at all if you can’t tighten all four. It might just get worse.


----------



## turbogear (Jul 28, 2021)

Liviu Cojocaru said:


> Hey guys, I have an issue with my MSI Gaming Trio X 6900XT... it seems that the junction temp climbs very high, up to 110-114°C, if I increase the power limit to the max.
> I have undervolted it to 1070mV on the GPU and left the power limit at default (0); I get a GPU frequency in games between 2450-2520MHz, and the memory is OCed to 2088MHz.
> The junction temp in this case goes up to around 90-100°C
> 
> ...



I have also heard in some forums of people having issues with cooling their 6900XT, and not only on MSI cards. There are even some reports from owners of factory-fitted water blocks.
My 6900XTU Liquid Devil Ultimate, which comes factory fitted with an EK water block, was also hitting 93-95°C although I have a pretty good water cooling loop.
For me the solution was to open my card and replace the standard thermal paste with liquid metal; my max hotspot went below 78°C.

At what fan speeds did you test this?
The default fan profile or a higher speed?
I had a few fan-cooled 6900XTs in my hands in the past.
If you increase the power limit, the temperature quickly goes above 90°C.
What always helped was a custom fan profile where the fan is already running at relatively high speed even at low temperatures. Also, please turn off the Zero RPM feature in the drivers; otherwise the fan stops every time it has the opportunity, and temperatures are worse when you are OCing the card.

Unfortunately, from my experience 6900XTs produce a lot of heat when you increase the power limit with MPT, and are therefore not easy to cool without a water block.
People who very seriously want to squeeze the best OC performance out of their 6900XT eventually end up switching to a water block.
One needs to test the card first, though. Some cards don't OC well at all; in that case a water block would be overkill.


----------



## Liviu Cojocaru (Jul 28, 2021)

turbogear said:


> I have also heard in some forums of people having issues with cooling their 6900XT, and not only on MSI cards. There are even some reports from owners of factory-fitted water blocks.
> My 6900XTU Liquid Devil Ultimate, which comes factory fitted with an EK water block, was also hitting 93-95°C although I have a pretty good water cooling loop.
> For me the solution was to open my card and replace the standard thermal paste with liquid metal; my max hotspot went below 78°C.
> 
> ...


Thanks for the reply. I am using a custom fan curve in MSI Afterburner. It seems that with any power limit increase, temps go to 100°C+ even with the fan speed set to go up to 80%.
I will probably leave it as it is now at 0 power limit, as this seems to keep the temps in the 90s. I will probably buy an AIO for it when one becomes available.


----------



## turbogear (Jul 28, 2021)

Liviu Cojocaru said:


> Thanks for the reply. I am using a custom fan curve in MSI Afterburner. It seems that with any power limit increase, temps go to 100°C+ even with the fan speed set to go up to 80%.
> I will probably leave it as it is now at 0 power limit, as this seems to keep the temps in the 90s. I will probably buy an AIO for it when one becomes available.


On the 6900XT Red Devil, I was using a 95% fan profile to keep hotspot temperatures in the range of 95°C.



Butanding1987 said:


> Keep us posted.


So here we go with the first results with the new 21.7.2 driver and the XTXH-LC bios flashed on my Liquid Devil Ultimate.
Memory is running at 2310MHz with Fast Timing 2.

I have finally crossed the 24K Time Spy mark.
There is still some room for improvement.

I saw power consumption of up to 460W on the core during Time Spy Graphics test 2.

There is also the Fire Strike Ultra score. I am now over 16K.

There are people on the Hardwareluxx forum who are getting scores close to 24,800.
The new driver also does some magic.
I need to tune further to get into that region.

I did further Curve Optimizer tuning for the 5900X in the bios and my Cinebench score went from ~8620 to ~8835, but it did not show much effect on my GPU score in Time Spy.
It is still in the same range as before.

I have also changed the pads from Alphacool Ultra Soft (3 W/mK) to Alphacool Rise Ultra Soft (7 W/mK).
I see that my memory temperatures during a Time Spy run went down from ~54°C previously to ~48°C.








I changed the pads. I am still using liquid metal, and to protect the small exposed components around the die, I filled this area with MX-5.


----------



## Kabouter Plop (Jul 29, 2021)

Hopefully the WoW pink screen bug goes onto the known issues list in the next driver, or onto the fixed list.


----------



## Valantar (Jul 29, 2021)

Kabouter Plop said:


> Hopefully the WoW pink screen bug goes onto the known issues list in the next driver, or onto the fixed list.


Just to be the one asking the stupidly obvious question: have you reported it through the bug report tool?


----------



## Kabouter Plop (Jul 29, 2021)

Valantar said:


> Just to be the one asking the stupidly obvious question: have you reported it through the bug report tool?



I have, many times, and they tend to ignore bug reports via the tool. For example, I can easily reproduce a driver crash 99% of the time just by opening the live view of my Synology Surveillance Station in Firefox, and they have ignored that bug report since I got my GPU. They did respond in the Reddit driver thread about the pink issue, so they are going to look into it. It's easy to reproduce, so I doubt they'll ignore it once they try, unless they have no clue and try to reproduce it under the wrong conditions.


----------



## Valantar (Jul 29, 2021)

Kabouter Plop said:


> I have, many times, and they tend to ignore bug reports via the tool. For example, I can easily reproduce a driver crash 99% of the time just by opening the live view of my Synology Surveillance Station in Firefox, and they have ignored that bug report since I got my GPU. They did respond in the Reddit driver thread about the pink issue, so they are going to look into it. It's easy to reproduce, so I doubt they'll ignore it once they try, unless they have no clue and try to reproduce it under the wrong conditions.


I doubt they ignore them, but no doubt they have a huge pile of reports to work through, and even sorting through them and assigning the work is likely a huge undertaking. I'd expect everything to get worked on sooner or later, though. That driver crash on opening the Synology live view sounds like it could just as easily be an issue with Synology's implementation of that web interface - it might not be something AMD is able to fix. It depends on what exactly is causing the crash and how. Does it crash in other browsers as well?


----------



## Kabouter Plop (Jul 29, 2021)

They should also hire some more people to read feedback on forums outside Reddit and their own forums, because that's where you find most of the feedback.


----------



## Felix123BU (Jul 29, 2021)

Kabouter Plop said:


> They should also hire some more people to read feedback on forums outside Reddit and their own forums, because that's where you find most of the feedback.


Not sure that would be feasible. Not trying to defend AMD, but they made a report tool that theoretically gives all the required data back to them for analysis; it's just that people are not used to using it.
I reported a "bug" through the tool, namely the on-screen metrics losing their size settings and reverting to 50% scaling. To my surprise, two weeks later in the new driver that was flagged and reported as solved; I did not really expect that. On the other hand, four drivers later, that same issue is back. I've reported it again; let's see if they fix it again.


----------



## dragospetre88 (Jul 29, 2021)

Guys, I have a major problem and maybe you could help me.

After I had my RX 6900 XT Red Devil Limited repasted a couple of months ago, the junction temp went back up, and I decided to ask someone to replace the thermal pads too (the same guy who did the repaste). It was the only solution I could think of.

Now, initially he used 2mm pads from Thermal Grizzly (https://ipon.ro/shop/produs/thermal-grizzly-minus-pad-8-30x30x2mm/1156574) and 3mm ones (https://www.itdirect.ro/pad-uri-termoconductoare/thermal-grizzly/minus-pad-8-20x-120x-3-0-mm/). When firing up games, the temp would reach 110 degrees in the menu screen and the PC would reboot.

The second time, he got rid of the 3mm ones and only used 2mm for everything. Now I can actually play games (at max 1440p resolution), but I need to limit the FPS in AMD Overdrive and keep power usage roughly around 160W in order not to overheat... I used both MSI Afterburner and the AMD overlay to monitor temps.

Obviously I'm a noob and cannot explain it better. My question is: has anyone here replaced the thermal pads on an RX 6900 XT Red Devil Limited?

Any help would be much appreciated.

Thanks!


----------



## Law67 (Jul 29, 2021)

dragospetre88 said:


> Guys, I have a major problem and maybe you could help me.
> 
> After I had my RX 6900 XT Red Devil Limited repasted a couple of months ago, the junction temp went back up, and I decided to ask someone to replace the thermal pads too (the same guy who did the repaste). It was the only solution I could think of.
> 
> ...


The Red Devil uses 1.5mm thermal pads for the RAM, VRM phases etc. Changing thermal pads is never a good idea if you don't know the correct size. Also, the stock pads that PowerColor uses are very soft and compress well; even with some 1.5mm thermal pads, if they don't compress as well as the original ones you could still run into problems with the cold plate not making good contact with the die. I had the same heat problems with my Red Devil: I could not even add any PL or it would throttle at 110°C. I took my Red Devil apart, put clear nail varnish over the semiconductors around the GPU, and added liquid metal to the die and cooler. This dropped the temps by 20°C, and I can now run 400W through my card at about 96°C. I did not change the thermal pads; I used the original ones. You could always use thermal putty instead of pads, as putty will compress fine.


----------



## Felix123BU (Jul 29, 2021)

dragospetre88 said:


> Guys, I have a major problem and maybe you could help me.
> 
> After I had my RX 6900 XT Red Devil Limited repasted a couple of months ago, the junction temp went back up, and I decided to ask someone to replace the thermal pads too (the same guy who did the repaste). It was the only solution I could think of.
> 
> ...


Your issue is basically a bad mount of the cooler, namely bad contact between the die and the cold plate. That can happen for a number of reasons: not enough pressure from the four screws over the die, too much counter-pressure from pads of the wrong size not allowing enough pressure on the die, or possibly incorrect thermal paste application. As said above, the pads can be too stiff and too thick as well.

The suggestion would be to re-use the original pads, if still available; if not, get some soft pads of the correct thickness. Alphacool has some: ENG_1021762_Alphacool_Rise_Ultra_Soft_thermal_pad_7Wmk_100x100x1mm_Datasheet.pdf

Also, as suggested above, liquid metal on the die can make a huge difference, but you or whoever applies it must really know what they are doing, or disaster can follow incorrect application.


----------



## dragospetre88 (Jul 29, 2021)

Law67 said:


> The Red Devil uses 1.5mm thermal pads for the RAM, VRM phases etc. Changing thermal pads is never a good idea if you don't know the correct size. Also, the stock pads that PowerColor uses are very soft and compress well; even with some 1.5mm thermal pads, if they don't compress as well as the original ones you could still run into problems with the cold plate not making good contact with the die. I had the same heat problems with my Red Devil: I could not even add any PL or it would throttle at 110°C. I took my Red Devil apart, put clear nail varnish over the semiconductors around the GPU, and added liquid metal to the die and cooler. This dropped the temps by 20°C, and I can now run 400W through my card at about 96°C. I did not change the thermal pads; I used the original ones. You could always use thermal putty instead of pads, as putty will compress fine.


Thanks for the advice. Can you please recommend the exact pads I could use (with a link so I can order)?


----------



## Law67 (Jul 29, 2021)

dragospetre88 said:


> Thanks for the advice. Can you please recommend the exact pads I could use (with a link so I can order)?


To tell you the exact pads to use is not easy without knowing how well they compress compared to the original ones. However, I have put two links below: one to a set of what I believe are soft 1.5mm thermal pads, the other to thermal putty, which is what I would use if I had lost the original thermal pads. With thermal putty you should not get a bad mount as a result of incorrect thermal pads, which is just one factor in obtaining a good, successful mount.









EC360® BRONZE 8W/mK Thermal Pad (www.coolsierra.com)

K5-PRO Thermal Paste (www.computer-systems.gr)

----------



## dragospetre88 (Jul 29, 2021)

Law67 said:


> To tell you the exact pads to use is not easy without knowing how well they compress compared to the original ones. However, I have put two links below: one to a set of what I believe are soft 1.5mm thermal pads, the other to thermal putty, which is what I would use if I had lost the original thermal pads. With thermal putty you should not get a bad mount as a result of incorrect thermal pads, which is just one factor in obtaining a good, successful mount.
> 
> 
> 
> ...


Thanks again. One more question: is this thermal putty applied just like any regular thermal paste?


----------



## GerKNG (Jul 29, 2021)

First couple of Time Spy runs with my 6700XT (why is it not even in the GPU database? It's the XFX Qick 319 Black Edition).
I am K0NG__


----------



## Law67 (Jul 29, 2021)

dragospetre88 said:


> Thanks again. One more question: is this thermal putty applied just like any regular thermal paste?


Yes, but it's thicker than thermal paste. You can still use a thermal paste spatula to scoop it out of the pot and plop it on each RAM chip etc., or even use your fingers.


----------



## turbogear (Jul 29, 2021)

Law67 said:


> To tell you the exact pads to use is not easy without knowing how well they compress compared to the original ones. However, I have put two links below: one to a set of what I believe are soft 1.5mm thermal pads, the other to thermal putty, which is what I would use if I had lost the original thermal pads. With thermal putty you should not get a bad mount as a result of incorrect thermal pads, which is just one factor in obtaining a good, successful mount.
> 
> 
> 
> ...


I am not too sure the Red Devil uses 1.5mm pads.
The 6900XT Liquid Devil uses 1mm pads under the factory-fitted EK water block.

I can recommend the Alphacool Rise Ultra Soft (7 W/mK), but these come in 0.5mm, 1mm, 2mm and 3mm only. I have not seen a 1.5mm version of these.








Alphacool Rise Ultra Soft Wärmeleitpad 7W/mK 50x50x2mm (www.aquatuning.de)
				




Actually, these are comparable to the original Liquid Devil pads from PowerColor. They are very soft, almost like a thermal paste, with a hardness of only 20 Shore OO.
As shown in my previous post, I am using these now on my Liquid Devil.

The first time I opened my card to apply liquid metal, some of the original pads got destroyed when I took the cooler apart. Therefore it is always good to have some backup pads on hand when opening the cooler.
Sometimes they are partially stuck to the component and partially to the heatsink, so they break apart when opening the cooler.

At that time Alphacool did not offer the Rise Ultra Soft; they offered only the Alphacool Ultra Soft (3 W/mK), which has the same hardness but less than half the thermal conductivity of the Rise Ultra Soft.
With the Rise I have almost 4-5°C better VRAM temperatures.

Note that I have flashed a special bios onto my Liquid Devil. It applies more voltage to the VRAM, thus producing more heat, which adds a few extra degrees to the VRAM.
That is why I wanted to test the Rise Ultra Soft, and I see this difference in temperatures now.



GerKNG said:


> First couple of Time Spy runs with my 6700XT (why is it not even in the GPU database? It's the XFX Qick 319 Black Edition).
> I am K0NG__
> View attachment 210364



That is the magic of the 21.7.2 driver.
People on Hardwareluxx have hit a 25K Time Spy score on the 6900XT.









[Sammelthread] Offizieller AMD RX 6700 / 6700XT / 6750XT / 6800 / 6800XT / 6900XT / 6950XT Overclocking und Modding Thread [Wakü - Lukü - LN2] (www.hardwareluxx.de)
				




I need to try to tune further to go in the direction of 25K.
As per my previous post from yesterday, I am at 24.2K at the moment, but there is more room for improvement.
I need to find a good setting for my 5900X and RAM to get closer to 25K.
Time Spy seems to like an OCed CPU and memory.

Either I need to get CTR 2.1 stable or do a manual OC in the bios.

My problem at the moment is time; I have too little of it for tuning.
You know, summer vacation has started over here and the kids need all the free time I have.


----------



## Kabouter Plop (Jul 29, 2021)

Time Spy still crashes sometimes with no result produced, but it scores even more now. It's crazy: it went from 18K to 19K. What's next, 20K?


----------



## Law67 (Jul 29, 2021)

turbogear said:


> I am not too sure the Red Devil uses 1.5mm pads.
> The 6900XT Liquid Devil uses 1mm pads under the factory-fitted EK water block.
> 
> I can recommend the Alphacool Rise Ultra Soft (7 W/mK), but these come in 0.5mm, 1mm, 2mm and 3mm only. I have not seen a 1.5mm version of these.
> ...


I measured the original thermal pads on my Red Devil at 1.5mm with my digital calipers when I took it apart to apply the LM. It's the same board on the PowerColor RX 6800 XT as on the PowerColor RX 6900 XT, the only difference being that the RX 6800 XT does not have a third 8-pin power connector. I don't know about the Liquid Devil, as I have never stripped one.


----------



## turbogear (Jul 29, 2021)

Kabouter Plop said:


> Time Spy still crashes sometimes with no result produced, but it scores even more now. It's crazy: it went from 18K to 19K. What's next, 20K?


One of the users at Hardwareluxx made a graph showing the Time Spy improvement over time since the launch of RDNA2.
This time AMD has some strange magic going on; they are still finding a lot of performance with every new driver launch.









[Sammelthread] Offizieller AMD RX 6700 / 6700XT / 6750XT / 6800 / 6800XT / 6900XT / 6950XT Overclocking und Modding Thread [Wakü - Lukü - LN2] (www.hardwareluxx.de)
				




The fine wine principle seems to be working for them.
I have lost track of where the RTX 3090 actually stands performance-wise these days. Did Nvidia also find some nice performance, or only AMD?



Law67 said:


> I measured the original thermal pads on my Red Devil at 1.5mm with my digital calipers when I took it apart to apply the LM. It's the same board on the PowerColor RX 6800 XT as on the PowerColor RX 6900 XT, the only difference being that the RX 6800 XT does not have a third 8-pin power connector. I don't know about the Liquid Devil, as I have never stripped one.


Okay.
It could be that there is a difference between the original PowerColor cooler and the Liquid Devil cooler. The Liquid Devil's water block is made by EK for PowerColor.
EK almost always uses 1mm pads for the VRAM and VRM on its blocks. That could be why the Liquid Devil uses 1mm while the other PowerColor cooler uses 1.5mm.
The advantage of a 1mm gap over a 1.5mm gap is less thermal resistance, since you only need a thinner pad, but it also means stricter tolerance control is needed in the manufacturing process than with a 1.5mm gap between the cooler and the component.
Tighter tolerances mean a higher manufacturing price.
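The thinner-pad point follows from the basic conduction formula R = t/(k·A). A minimal sketch, assuming a made-up 12x14mm chip footprint and ignoring contact resistance and pad compression:

```python
def pad_thermal_resistance(thickness_m: float, k_w_per_mk: float, area_m2: float) -> float:
    """Conduction resistance of a thermal pad in K/W: R = t / (k * A).
    Simplified model: ignores contact resistance and pad compression."""
    return thickness_m / (k_w_per_mk * area_m2)

area = 0.012 * 0.014  # hypothetical 12x14 mm memory chip footprint, in m^2
r_10 = pad_thermal_resistance(0.0010, 7.0, area)  # 1.0 mm pad at 7 W/mK
r_15 = pad_thermal_resistance(0.0015, 7.0, area)  # 1.5 mm pad at 7 W/mK
print(f"1.0 mm pad: {r_10:.2f} K/W")  # ~0.85 K/W
print(f"1.5 mm pad: {r_15:.2f} K/W")  # ~1.28 K/W
# At ~3 W dissipated per chip, the thinner pad runs roughly 1.3 K cooler:
print(f"delta at 3 W: {3 * (r_15 - r_10):.1f} K")
```

Of course the real-world win depends on mounting pressure and how much the pad actually compresses, which is exactly why the tolerances matter.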



Kabouter Plop said:


> Time Spy still crashes sometimes with no result produced, but it scores even more now. It's crazy: it went from 18K to 19K. What's next, 20K?


Strange.
Usually I have Time Spy crash only if I have some unstable setting.
It is weird that you have this phenomenon at default settings.

It could be that your card is trying to pull more power than is allowed by the default setting, and that results in a crash.
I sometimes had issues like that, but on an OCed card, and there I was able to stabilize it by allowing more power with MPT; I think that should not be happening at default settings, though.


----------



## Kabouter Plop (Jul 29, 2021)

I usually force Radeon sharpening globally and set the FPS limit globally to 400, at least until my Samsung G9 returns (hopefully a new one soon), then 236. The old one has been through 4 dead pixels: 1 fixed itself, 2 fully dead green pixels, plus 1 new half-dead pixel appeared. I decided I'd had enough of this monitor, so I demanded an exchange; now they want to repair it, which isn't possible. You can't fix dead pixels, unless you think smashing the screen counts as fixing it, but that just makes it worse.


----------



## Felix123BU (Jul 29, 2021)

Kabouter Plop said:


> Time Spy still crashes sometimes with no result produced, but it scores even more now. It's crazy: it went from 18K to 19K. What's next, 20K?


I am at 20657 with my 6800XT   
AMD Radeon RX 6800 XT video card benchmark result - AMD Ryzen 7 5800X,Gigabyte Technology Co., Ltd. B550 AORUS PRO (3dmark.com)



Kabouter Plop said:


> I usually force Radeon sharpening globally and set the FPS limit globally to 400, at least until my Samsung G9 returns (hopefully a new one soon), then 236. The old one has been through 4 dead pixels: 1 fixed itself, 2 fully dead green pixels, plus 1 new half-dead pixel appeared. I decided I'd had enough of this monitor, so I demanded an exchange; now they want to repair it, which isn't possible. You can't fix dead pixels, unless you think smashing the screen counts as fixing it, but that just makes it worse.


It should never crash at pure stock settings, meaning no MPT, no OC, pure default settings, but once you do MPT and OC it gets unstable fast; it's tough to find working OC settings.


----------



## turbogear (Jul 29, 2021)

The top 10 spots in Time Spy are still dominated by the 3090.

But I think a lot of people from Hardwareluxx have not reported their scores yet.
These enthusiasts have been tuning their cards 24/7 without sleep since yesterday, and I think there will soon be more 25K-range scores from the 6900XT on this chart.






Felix123BU said:


> I am at 20657 with my 6800XT
> AMD Radeon RX 6800 XT video card benchmark result - AMD Ryzen 7 5800X,Gigabyte Technology Co., Ltd. B550 AORUS PRO (3dmark.com)
> 
> 
> It should never crash at pure stock settings, meaning no MPT, no OC, pure default settings, but once you do MPT and OC it gets unstable fast; it's tough to find working OC settings.


You mean the 22,891 GPU score.
A 20,657 GPU score would have been boring at this point.


----------



## Felix123BU (Jul 29, 2021)

turbogear said:


> The top 10 spots in Time Spy are still dominated by the 3090.
> 
> But I think a lot of people from Hardwareluxx have not reported their scores yet.
> These enthusiasts have been tuning their cards 24/7 without sleep since yesterday, and I think there will soon be more 25K-range scores from the 6900XT on this chart.
> ...


At least the top 4 of those are done on liquid nitrogen; on water they are similar to a 6900XT. I shiver at the thought of the watts one needs to run a 3090 at 2.5GHz.



turbogear said:


> The top 10 spots in Time Spy are still dominated by the 3090.
> 
> But I think a lot of people from Hardwareluxx have not reported their scores yet.
> These enthusiasts have been tuning their cards 24/7 without sleep since yesterday, and I think there will soon be more 25K-range scores from the 6900XT on this chart.
> ...


Yup, I thought @Kabouter Plop meant 19K overall. And yes, from my first Time Spy GPU score attempt of 18.5K (OCed) to almost 23K is about the amount of driver improvement over time (plus some CPU OC), much more than I would have expected. It does not all translate into extra real game performance, but that has also grown nicely since day one.


----------



## turbogear (Jul 29, 2021)

Felix123BU said:


> At least the top 4 of those are done on liquid nitrogen; on water they are similar to a 6900XT. I shiver at the thought of the watts one needs to run a 3090 at 2.5GHz.


People at Hardwareluxx are also becoming very creative.
Just go and read that forum and see the temperature reports from Time Spy runs.
Ideas like a bucket of ice for submerging the radiator seem to be common.

Some even use devices like a chiller:








Alphacool Eiszeit 2000 Chiller - Black (www.aquatuning.de)


----------



## Felix123BU (Jul 29, 2021)

turbogear said:


> People at Hardwareluxx are also becoming very creative.
> Just go and read that forum and see the temperature reports from Time Spy runs.
> Ideas like a bucket of ice for submerging the radiator seem to be common.
> 
> ...


I see, taking a trick out of Intel's handbook for gaining extra performance... the Intel 5GHz 28-core demo.

Best take on that: 28 Cores of Bulls#!t - Intel's "5GHz" Parlor Trick - YouTube


----------



## jesdals (Jul 30, 2021)

Hmm, these standard fan curve settings in the AMD drivers seem stupid. Changing the fan curve to a manual setting like this



made my hotspot temps go to a record low during an hour of gaming.



Both the memory max and the GPU hotspot are more than 8°C lower than with the default profile. It seems stupid that it's not a low-noise option instead of a limiting factor while gaming.
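For anyone curious, a manual curve like the one described is essentially linear interpolation between temperature/duty points. A small sketch (these curve points are illustrative, not the exact settings above):

```python
def fan_duty(temp_c: float, curve: list[tuple[float, float]]) -> float:
    """Linearly interpolate fan duty (%) from (temp, duty) points sorted by
    temperature. Clamps to the first/last point outside the curve's range."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Illustrative aggressive curve: fans already spinning fast at modest temps.
aggressive = [(40, 40), (60, 60), (75, 80), (85, 100)]
print(fan_duty(50, aggressive))  # -> 50.0
print(fan_duty(90, aggressive))  # -> 100 (clamped at the top point)
```

The trade-off is noise at idle, which is why turning off Zero RPM and raising the low-temperature points only makes sense if you are chasing temperatures rather than silence.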


----------



## Kabouter Plop (Jul 30, 2021)

The pink screen in WoW can also be fixed by not using in-game MSAA together with render scale. Honestly, I don't think I had ever used both at the same time before.
Playing around with the pink screen








Playing around with ray traced shadows








Anything above Fair will be glitchy.


----------



## kiddagoat (Aug 3, 2021)

Since getting my 6900XT figured out, I am really enjoying it now.  I have been using it to do some AI upscaling with DVDFab on my anime DVD collection.  It is a bit faster than my 2080 Super, and I can't complain about the quality.

I am able to keep it stable with a min clock of 2600MHz and a max of 2700MHz.


----------



## Valantar (Aug 3, 2021)

Spent a bit of time exploring undervolting and underclocking my 6900 XT Liquid Devil Ultimate today. There's some interesting behaviour to be found here, especially around what seems to be a pretty hard voltage/clock scaling step, where the chip requires a significant boost in voltage to be stable at all.

At stock, my GPU is a bit limited by cooling (a single slim 280mm rad for the whole system), but runs ~2550MHz when not thermally limited (i.e. before the liquid reaches steady state), at 330W and, I assume, 1200mV (that's what Wattman defaults to, at least), with the default range of 500-2614MHz. (It steps down to ~2.4GHz when it heats up, maintaining 330W GPU power.) Undervolting at stock is essentially a no-go - even 1175mV isn't Time Spy stable.* And that applies when stepping down to quite a lot lower clocks - even 2100/2300 isn't stable at 1175mV.
My stock Time Spy scores are ~18900 total/21500 graphics. (That's with a 5800X.)





So I decided to start at the other end: rather than starting at stock and stepping downwards as I go, I chose an arbitrary low clock and voltage setting to see if it was stable - I picked 1600/1800 1000mV. And it was. So was 975mV, and 950mV. For now I haven't tested any lower. I didn't see much power scaling between 1V and 950mV - both reported ~150-160W GPU power draw during Time Spy. Performance was pretty crappy though, at 15500/16700.





So I decided to start stepping clocks back upwards, seeing how far I could go at 950mV. 1700/1900? Check. 1800/2000? Check. 1900/2100? Nope. Not at 975 or 1000mV either. But 1850/2050 at 950mV is TS stable, and scores ~17000/18600, all while consuming about 180W, +/- 10W.





One sort of interesting thing (at least to me): whatever I set my max clock to, it seems to sustain ~60MHz less than that, rock solid. 1800=1740, 1900=1840, 2000=1940, 2050=1990.

*To summarize: for a ~11%/14% performance loss (total/GPU scores), I've cut power draw by ~42%* (assuming sustained 330W/190W draws). That's a win in my book. At the very least I've got a very attractive-looking low power draw profile for games that don't need all the power available from this GPU.
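For the record, the trade-off above can be sanity-checked in a few lines, using the approximate 330W/190W sustained draws and the scores from this post (the exact percentages vary a bit depending on which runs you compare):

```python
# Quick sanity check of the stock-vs-undervolt trade-off described above.
# Wattages and scores are the approximate figures from this post.
stock = {"total": 18900, "graphics": 21500, "watts": 330}
uv = {"total": 17000, "graphics": 18600, "watts": 190}

perf_loss_total = 1 - uv["total"] / stock["total"]
perf_loss_gfx = 1 - uv["graphics"] / stock["graphics"]
power_saving = 1 - uv["watts"] / stock["watts"]

print(f"total score loss:    {perf_loss_total:.1%}")  # ~10.1%
print(f"graphics score loss: {perf_loss_gfx:.1%}")    # ~13.5%
print(f"power saving:        {power_saving:.1%}")     # ~42.4%

# Points per watt improves substantially:
print(f"graphics pts/W stock: {stock['graphics'] / stock['watts']:.1f}")  # ~65.2
print(f"graphics pts/W UV:    {uv['graphics'] / uv['watts']:.1f}")        # ~97.9
```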

I'll make some tests to see if anything lower than 950mV is feasible, as well as exploring if there are any intermediate stable settings between that range and 1.2V. But for now, I'm pretty happy with the results of about an hour's work.


*Note (and this applies to all my testing): I run dual monitors, and as I'm looking for something suitable for daily use I'm testing with that, which does seem to hurt stability somewhat.

Edit: I've been using Aquasuite to monitor my power and thermals, and it seems their "GPU Core" power reading is what AMD calls "GPU Power", which seems to be what HWinfo calls either GPU ASIC power or TGP Power. Either way, it should be full-board power, not just the GPU itself.

Edit2: UV/UC is completely stable from playing a couple of hours of Metro Exodus, and performance is not noticeably changed. Power draw in-game fluctuates between 110-130W in outdoor areas to 160-170W in indoor areas. And my GPU is happily chugging along at ~45°C rather than 60-ish, and the CPU is much cooler too as the water in the loop is much cooler. _Very_ happy with these results.


----------



## Butanding1987 (Aug 4, 2021)

You shouldn't have bought a 6900 XT Liquid Devil if you're trying to save power. Underclocking is forbidden on  this forum  — just ask @turbogear, who's now looking at blasting his Liquid Devil Ultimate with all the voltage that it needs.


----------



## turbogear (Aug 4, 2021)

Butanding1987 said:


> You shouldn't have bought a 6900 XT Liquid Devil if you're trying to save power. Underclocking is forbidden on  this forum  — just ask @turbogear, who's now looking at blasting his Liquid Devil Ultimate with all the voltage that it needs.


You are right. 
Underclocking is nothing for me. For me, that would mean the Liquid Devil would be like having a Ferrari and driving it on 3 cylinders with the top speed limited to 100 km/h.


----------



## Valantar (Aug 4, 2021)

Butanding1987 said:


> You shouldn't have bought a 6900 XT Liquid Devil if you're trying to save power. Underclocking is forbidden on  this forum  — just ask @turbogear, who's now looking at blasting his Liquid Devil Ultimate with all the voltage that it needs.





turbogear said:


> You are right.
> Underclocking is nothing for me. For me that would mean Liquid Devil would be like having Ferrari and driving on 3 cylinders with top speed limited to 100km/h.


Weeeeeell ... what counts as pushing things to the extreme is highly context dependent, after all. IMO, my setup goes further than theirs: @turbogear has a 127-liter case with 6x120mm (likely thicker than mine) radiators, while I have a 14.7-liter case with 2x140mm 30mm-thick radiators. So their radiator-to-case-volume ratio isn't particularly impressive, and you could fit a theoretical 8.7 of my systems inside of theirs (though in reality it's likely more like 4-5 - the measurements sadly don't quite match up). Still, the LDU is rather extreme for my setup even at stock - though the case and cooling do keep up admirably IMO. To me, maxing out efficiency is just as interesting as maxing out performance. Or even more, really - brute-forcing higher performance isn't particularly fun, and there aren't many of us with the time and expertise to do proper high-end overclocking. I'll stick to watching Buildzoid do his thing.



----------



## Butanding1987 (Aug 4, 2021)

Valantar said:


> I'll stick to watching Buildzoid do his thing


Buildzoid will roast you for owning a Biostar motherboard.


----------



## Valantar (Aug 4, 2021)

Butanding1987 said:


> Buildzoid will roast you for owning a Biostar motherboard.


Ah, I see I haven't updated my system specs. That one lives in my NAS now. In my defense, it was the only AM4 ITX motherboard in existence at the time. FWIW, it was pretty crap (rather unstable, refused to run RAM above 2933, had those weird 1st-gen Ryzen random shutdowns, for some reason my NVMe SSD ran crazy hot), but it's been serving me well for the past few months in my NAS (39 days of uptime and counting). Even handles ECC UDIMMs nicely.


----------



## Kabouter Plop (Aug 5, 2021)

Probably a little bit off-topic, but my second Samsung G9 arrived - an even bigger downgrade, with 15+ dead pixels. How do people even manage to get new screens these days with 0 issues?


----------



## GerKNG (Aug 5, 2021)

Anyone with a 6700 XT that cannot run fast timings at all?
I have two 6900 XTs; both perform best at ~2100 Fast.
My 6700 XT QICK 319 Black (highest-end XFX bin) cannot run ANY memory clock with fast timings, but does the full 2150 without them. (Fast timings = very hard lockups and bluescreens. For example: grey screen with a few blue dots --> reboot, or the game locks up with an extremely pixelated screen (in New World and PUBG within minutes), ending in a hard crash and reboot.)


----------



## Garlic (Aug 5, 2021)

GerKNG said:


> anyone with a 6700XT that can not run fast timings at all?
> i have two 6900XT both perform the best at ~ 2100 Fast.
> my 6700XT Qick 319 Black (highest end XFX bin) can not run ANY memory clock with fast settings. but full 2150 without. (Fast settings = very hard lockups, bluescreens. for example: grey screen, a few blue dots --> reboot. game locks up with an extremely pixelated screen (in new world and PUBG within minutes) ending in a hard crash - reboot.)


My 6700 XT Nitro can run fast timings even at 2150; it's just that at that point it performs worse than stock. Increasing the voltage via MPT does not fix the problem for me, so I just run 2150 with normal timings. (Also, shouldn't the highest bin be the MERC 319?)


----------



## turbogear (Aug 5, 2021)

GerKNG said:


> anyone with a 6700XT that can not run fast timings at all?
> i have two 6900XT both perform the best at ~ 2100 Fast.
> my 6700XT Qick 319 Black (highest end XFX bin) can not run ANY memory clock with fast settings. but full 2150 without. (Fast settings = very hard lockups, bluescreens. for example: grey screen, a few blue dots --> reboot. game locks up with an extremely pixelated screen (in new world and PUBG within minutes) ending in a hard crash - reboot.)


I had issues before with my 6900 XTU Liquid Devil Ultimate. It could not run a memory OC at all, with or without Fast Timing.
To my surprise, this is apparently a limitation coming from BIOS settings that we cannot influence.

I have since moved to the new XTXH-LC BIOS, and now my card's memory runs at 2310MHz at Fast Timing level 2.
I had to flash it with a special programmer.
I got some help from Hardwareluxx members.




Here is my latest Time Spy Score:


It is in 5th place in overall score for the 6900 XT + 5900X combination.
This was impossible with the original BIOS, where my memory would not OC at all.

I am still trying to find some more points.
The guys above me in this list mostly have even better cooling solutions.
For example, I know Holzmann from Hardwareluxx.
He put an air conditioner next to his MO-RA (https://shop.watercool.de/MO-RA3-420-LT-black_1) radiator to reach that score.
I don't own anything like that and don't plan to.


----------



## GerKNG (Aug 6, 2021)

Garlic said:


> My 6700xt nitro can run at fast timings even at 2150, it's just that at that point it performs worse than stock. Increasing the voltage via mpt does not fix the problem for me and I just run 2150 with normal timings. (Also should't the highest bin be the Merc 319?)


Well, I can run the memory at 2150 with no issues (it even scales properly),
but fast timings (even at stock memory clock) result in extremely hard crashes (barely any load = grey screen with blue circles on it, black screen, reboot).
The GPU runs at around 2800-2830MHz (2860MHz @ 1150mV in Wattman).


----------



## Garlic (Aug 6, 2021)

GerKNG said:


> well i can run the memory at 2150 with no issues (it even scales properly)
> but fast timings (even at stock) results in extremely hard crashes. (barely any load = (for example) grey screen with blue circles on it, black screen, reboot)
> the GPU runs at around 2800-2830 Mhz (2860Mhz 1150mv in Wattman)


Hmm, interesting, maybe some memory chips just don't like fast timings. Mine does do fast 2150, but there are artifacts and stuff, not to mention that it loses performance at that clock. Also, what are your temps like with that OC? My 6700 XT, when OCed and running FurMark, can reach 100°C junction and 280W power draw.


----------



## turbogear (Aug 6, 2021)

Garlic said:


> Hmm, interesting, maybe some mem chips just doesn't like fast timings. Mine does do fast 2150 but there artifacts and stuff, not to mention that it losses performance at that clock.  Also whats your temps like with that oc? My 6700xt when oced and running furmark can reach 100 junction and 280 watts power draw


Many RDNA2 cards don't seem to like VRAM OCed much higher than 2100MHz.
The sign that you have pushed the memory OC past its useful point is that performance drops for any frequency above it.
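One practical way to apply that rule is to sweep VRAM clocks, benchmark each, and keep the highest clock that still improved the score - anything past the peak is counterproductive. A minimal sketch with hypothetical scores:

```python
# Hypothetical (VRAM clock in MHz, benchmark score) pairs from a sweep
sweep = [(2000, 20100), (2050, 20250), (2100, 20400), (2150, 20150)]

# The highest clock worth running is the one with the best score,
# since scores drop past the useful OC point
best_clock, best_score = max(sweep, key=lambda point: point[1])
print(best_clock)  # 2100
```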

My 6800 XT could OC VRAM only to 2110MHz at Fast Timing, and the 6900 XTU could not memory OC at all, but as I mentioned in the previous post, it seems some setting choice made by AMD is limiting this.

The proof of that seems to be the new AMD XTXH-LC card.








AMD launches Radeon RX 6900 XT Liquid Edition with 330W TBP and 18Gbps memory - VideoCardz.com

AMD has unveiled its flagship GPU from the Radeon RX 6000 series, the RX 6900 XT LC. A hybrid cooling solution attached to an external 120mm radiator with a single fan is not the only change in comparison to the vanilla RX 6900...

videocardz.com
				




People found out that it basically has the same memory chips but different settings.
This card shares the reference design with other XTXH cards, and people found out that flashing the BIOS from this reference 6900 XT XTXH-LC card to other custom XTXH cards works and unlocks higher memory clocks.

Regarding hotspot temperatures, my Liquid Devil also approached 95°C on water before.
I replaced the standard thermal paste with liquid metal to get the hotspot below 80°C.


----------



## HD64G (Aug 6, 2021)

Valantar said:


> Spent a bit of time exploring undervolting and underclocking my 6900 XT Liquid Devil Ultimate today. There's some interesting behaviour to be found here, especially around what seems to be a pretty hard voltage/clock scaling step, where the chip requires a significant boost in voltage to be stable at all.
> 
> At stock, my GPU is a bit limited by cooling (single slim 280mm rad for the whole system), but runs ~2550MHz when not thermally limited (i.e. before the liquid reaches steady state) at stock, i.e. 330W and I assume 1200mV (that's what Wattman defaults to, at least), with the default range of 500-2614MHz. (It steps down to ~2.4GHz when it heats up, maintaining 330W GPU power.) Undervolting at stock is essentially a no-go - even 1175mV isn't Time Spy stable.* And that applies when stepping down to quite a lot lower clocks - even 2100/2300 isn't stable at 1175mV.
> My stock Time Spy scores are ~18900 total/21500 graphics. (That's with a 5800X.)
> ...


In your shoes, I would load the UV profile for all games apart from the ones that need max power, and for benchmarks. A much less life-consuming workload for your mighty GPU, which could then last at least 4 more years.


----------



## Butanding1987 (Aug 6, 2021)

turbogear said:


> Many RDNA2 cards don't seem to like VRAM OCed much higher than 2100MHz.
> The indication for the point where memory OC is not good any higher you go is that you will see performance drop for any frequency above that point.
> 
> My 6800XT could OC VRAM only to 2110MHz at Fast Timing and 6900XTU could not memory OC at all but as I mentioned in previous post it seems that some setting choice made by AMD is limiting this.
> ...


I want to flash that AMD XTXH BIOS on my 6800 XT..


----------



## GamerGuy (Aug 6, 2021)

Eh, wasn't there a post from someone about me starting this thread as a bragging thread (or something to that effect)? I could have sworn I'd seen it on my phone while I was out and about. I was gonna be cool about it and say that he/she is entitled to his/her opinion... but when I got back on my desktop, 'poof!', that post had vanished (did I imagine it?). Regardless, I'd lost an opportunity to play the card, danggit!


----------



## the54thvoid (Aug 6, 2021)

Post was deleted. No drama required.


----------



## GamerGuy (Aug 6, 2021)

the54thvoid said:


> Post was deleted. No drama required.


Good to see the mods doing a standout job here; I've been to many forums where the mods don't seem to bother and let threads devolve into chaos with flames hurled all over before they'd act.


----------



## turbogear (Aug 6, 2021)

GamerGuy said:


> Good to see mods are doing a standout job here , been to many forums where mods don't seem to bother, and let threads devolved into chaos with flames hurled all over before they'd act.


Yes, that is the nice thing about TPU: the admins are very vigilant.
I saw that post and was afraid a flame war would start here and mess up our thread, so I reported it.



Butanding1987 said:


> I want to flash that AMD XTXH BIOS on my 6800 XT..


You can try it, but I think it will brick your card.
It is not compatible with the 6800 XT.

The nice thing about the CH341A programmer is that if your card is bricked, you can flash it again with the correct BIOS.
One other thing to mention: when flashing the BIOS with the programmer, the card should be disconnected from the computer and lying on your desk.
The power to the BIOS chip will come from the programmer.
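For reference, the CH341A is typically driven with flashrom; a hedged sketch of the usual sequence (file names are placeholders, and depending on the board's SPI chip you may need to name it explicitly with `-c`):

```shell
# Card out of the PC, CH341A clip attached to the BIOS chip;
# the programmer supplies power to the chip.
flashrom -p ch341a_spi -r backup_original.rom   # always back up the stock BIOS first
flashrom -p ch341a_spi -w xtxh_lc.rom           # write the new image
flashrom -p ch341a_spi -v xtxh_lc.rom           # verify the chip against the file
```

If the flash goes wrong, writing `backup_original.rom` back with the same `-w` invocation recovers the card - which is exactly the safety net described above.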


----------



## Valantar (Aug 6, 2021)

HD64G said:


> In your shoes I would load the UV profile for all games apart from the ones that need max power and the benchmarks. Much less life-consuming workload for your mighty GPU that could last at least 4 years on.


I've been using it full time since I set it, and I'm planning to keep it on. At least until I upgrade my main monitor - this profile is plenty for 1440p60. Metro Exodus played great with the profile enabled and all settings maxed out (including Ultra RT + RT reflections). It did dip below 60fps at times: Radeon Software logged an overall average of 59.7fps with 95th-percentile frametimes of 17ms, so it was below 60 for a bit, but in a game like that it's nothing I can say I actually noticed (even with Vsync on!). I did see some minor instability (one crash just after starting the game, and some 2-3-second in-game freezes), so I stepped up to 1V, and the issues seemed to go away with no increase in power draw.

Finished ME yesterday, tried my hand at some Jedi - Fallen Order (working my way into the "The Fury X couldn't handle this" part of my backlog), which was locked at 60 (again, all settings maxed) with the GPU even clocking down a bit further, around 1800MHz, so it was clearly not 100% busy. And I'm not one of those "ultra settings or death!" placebo addicts - even with a GPU like this I'd be perfectly fine with tuning some settings for better performance (which will likely be everyday reality if I end up moving to an UHD monitor). Ultra settings are generally not perceptibly different from High anyhow.

My plan going forward is to keep the profile enabled for anything where I don't _really_ need more performance. In those (likely to be few) cases I'll happily let my system get toasty - it can handle it.

Oh, and with this UC/UV profile I've been seeing ~45°C GPU core temps and +~10° hotspot temps, compared to ~60°C core and +~25° hotspot. Not all that happy with the latter, makes me wonder if the card could do with a refit, but ... meh. It's fine, especially with the UV. Even with hotspot temps close to 90 in worst case scenarios that's still 20° below anything dangerous. And that's in the middle of summer with 30°C ambient temps, so most of the year should be much better. As you say, this will at best extend the lifespan of the card, at worst do nothing at all except allow me a cooler and more pleasant system, and I can always dial things back to stock should I want to. Heck, with my QDCs I could even hook up my spare 240mm rad externally if I _really_ needed that extra performance (or wanted to try my hand at OCing).


----------



## GerKNG (Aug 6, 2021)

Garlic said:


> Hmm, interesting, maybe some mem chips just doesn't like fast timings. Mine does do fast 2150 but there artifacts and stuff, not to mention that it losses performance at that clock.  Also whats your temps like with that oc? My 6700xt when oced and running furmark can reach 100 junction and 280 watts power draw


Around 60°C edge, 78-85°C hotspot (full load, max power limit of 242W), and the memory sits in the low 60s at peak.


----------



## Garlic (Aug 7, 2021)

GerKNG said:


> around 60°C edge, 78-85°C Hotspot (full load, max powerlimit of 242W) and the memory sits in the low 60s at peak.


I'm starting to think my card has a bad thermal paste application; under full load at 280W the edge is in the 50s while the hotspot is 100+. Perhaps I'll repaste it sometime.


----------



## turbogear (Aug 7, 2021)

Butanding1987 said:


> I want to flash that AMD XTXH BIOS on my 6800 XT..


I can see you buying a 6900 XT XTXH card, flashing it with the XTXH-LC BIOS, and beating my score by 2000 points.



Garlic said:


> I'm starting to think my card has bad thermal paste application, under full load at 280w the edge is in the 50s and the hotspot being 100+. Perhaps I'll repaste it sometime


If the difference between edge and hotspot is 50°C, then there seems to be an issue with the cooler not making proper contact with the GPU.
If you have warranty, maybe you can ask the shop where you bought it for an RMA and get a replacement.


----------



## Garlic (Aug 7, 2021)

turbogear said:


> I see you buying 6900XT XTXH card and flashing it with XTXH-LC bios and beating my score by 2000 points.
> 
> 
> If the difference between edge and hotspot is 50°C then there seems to be like an issue with cooler having proper contact with the GPU.
> If you have warranty, maybe you can ask the shop where you bought it to RMA it and maybe you get a replacement.


I ran a set of tests again: when gaming, the edge is around 55 and the hotspot around 85, and when running FurMark, the edge is around 50 and the hotspot 50 degrees higher, so the edge temp is a bit... strange?
I asked the store I got it from; they said it's running in spec ("below 110°C"). Guess I'll have to find another way.


----------



## Valantar (Aug 7, 2021)

Garlic said:


> I ran a set of tests again, when gaming the dege is around 55 and hotspot around 85, and when running furmark edge around 50 and hotspots 50 degrees higher, so the edge temp is a bit..strange?
> I asked the store I got it from, they said its running in spec "Below 110C"  Guess I'll have to find another way.


That could be a coldplate flatness issue, a mounting pressure issue, a thermal paste issue, bad contact due to too-thick thermal pads, or a few other things. Different workloads will also create different deltas between edge and hotspot temps, particularly ultra-hot workloads like FurMark. In general, FurMark has the potential to kill GPUs (exactly because of how it generates some insane hotspots), so ... don't use it. It's extremely unrealistic in so many ways. Pretty much any highly demanding benchmark is a better choice for stress testing.

Still, a repaste and remount sounds like it would be in order, just to rule out those issues. As long as it won't void your warranty, that is.


----------



## Butanding1987 (Aug 7, 2021)

turbogear said:


> I had issues before with my 6900XTU Liquid Devil Ultimate. It could not run memory OC at all independent of Fast Timing or not.
> To my surprise this is somehow limitations depending on the bios settings that we cannot influence.
> 
> I have moved since some time to new XTXH-LC bios and now my card memory runs at 2310MHz at Fast Time level 2.
> ...


That is a very good score. Good job. And that VRAM overclock is something else.
That 5900x is holding you back. You need a 5950x. LOL.
You should be able to raise your CPU score by just doing a manual straight overclock — simple and easy. No need for PBO. Try 4700 all cores @~1.338v (VID, not TEL). You should get at least 16,000 CPU points in Time Spy. This is just for 3dmark so it should be safe.

I tried the Wild Life and Wild Life Extreme benchmarks just for fun.   








I ran Time Spy with a moderate GPU overclock using the 5950x and breached 22k for overall score. But my score is invalid because of some “time measurement inconsistencies”. I ran it twice and both my scores were invalid.


----------



## turbogear (Aug 7, 2021)

Butanding1987 said:


> That is a very good score. Good job. And that VRAM overclock is something else.
> That 5900x is holding you back. You need a 5950x. LOL.
> You should be able to raise your CPU score by just doing a manual straight overclock — simple and easy. No need for PBO. Try 4700 all cores @~1.338v (VID, not TEL). You should get at least 16,000 CPU points in Time Spy. This is just for 3dmark so it should be safe.


I am still not at the end of tuning my 6900 XT.
The OC community at Hardwareluxx provides tips on what one can do in MPT (MorePowerTool).
I have not explored all the options offered in the latest Beta 6 version of MPT.
When I get more time, I will do so.

I was toying with the thought of going to a 5950X myself.
But on the other hand, I think AMD will release a new processor at the end of the year with the 3D cache that they say improves gaming performance by 15%.

I tried the Dynamic OC Switcher feature of the ASUS Crosshair VIII Dark Hero, which does a manual OC on all cores.
Unfortunately, my 5900X does not like an OC up to 4.7GHz. I tried Core VID up to 1.35v.
On the Dark Hero you can select different OC frequencies for CCD0 and CCD1 in Dynamic OC Switcher.
My CCD1 is weaker than CCD0.
I tried CCD0 at 4.6GHz and CCD1 at 4.5GHz with Core VID up to 1.35v.
I got nice blue screens with WHEA errors even before I could start the Prime95 test.
If I put both CCDs at 4.6GHz, then the PC does not boot at all at a similar VID.
With PBO and Curve Optimizer the CPU runs up to 4.575-4.6GHz on all cores in Cinebench with a score of ~8960.

How high can I go on VID?

How I improved my CPU score in Time Spy from 12500 to ~13500 was just by changing the RAM from G.Skill F4-3600C14-8GTZNB to G.Skill F4-3800C14-16GTZN.
This is dual-rank memory with higher speed; the old kit was single-rank.
Dual rank provides speed advantages.
I want to try to OC it further by tightening the timings like Igor shows in the link below, but I need to put some cooling on it first.









Does the new high-end RAM for Ryzen 5000 live up to its promise? - G.SKILL DDR4-3800 CL14 2x 16GB kit put through its paces | Page 3 | igor'sLAB

The Trident Z RGB and Neo series from G.SKILL have been around almost as long as DDR4. However, for the new Ryzen 5000 CPUs, G.SKILL has refreshed their SKUs and added new XMP variants specifically…

www.igorslab.de
				




Temps already go above 54°C at XMP when I run TestMem5.
There is not much airflow around the RAM due to the water cooling setup for the CPU.
I will try to mount a 120mm fan in the area to cool both the RAM and the mainboard VRM.


----------



## Garlic (Aug 7, 2021)

Valantar said:


> That could be a coldplate flatness issue, a mounting pressure issue, a thermal paste issue, a bad contact due to too thick thermal pads issue, and a few other explanations. Different workloads will also create different deltas between edge and hotspot temps, particularly ultra-hot workloads like Furmark. In general, Furmark has the potential to kill GPUs (exactly because of how it generates some insane hotspots), so ... don't use it. It's extremely unrealistic in so many ways. Pretty much any highly demanding benchmark is a better solution for stress testing.
> 
> Still, a repaste and remount sounds like it would be in order, just to rule out those issues. As long as it won't void your warranty, that is.


Alright, guess I'll repaste it sometime. However, there are two stickers on the screws. Perhaps I could try removing them without damage? Anyway, thanks for the help.


----------



## Valantar (Aug 7, 2021)

Garlic said:


> Alright, guess I'll repaste it sometime. However there's two stickers on the screws. Perhaps I could try remove it without damaging it? Anyways thanks for the help.


Where are you located? If you're in the US, those stickers are meaningless. Elsewhere it's a bit of a crapshoot whether they affect your warranty or not.


----------



## turbogear (Aug 8, 2021)

Garlic said:


> Alright, guess I'll repaste it sometime. However there's two stickers on the screws. Perhaps I could try remove it without damaging it? Anyways thanks for the help.


It's not easy to remove those stickers without damaging them.

In Germany the warranty is void if one removes the stickers, but one solution is to buy replacement stickers from AliExpress.
You can find almost every brand's warranty stickers there.
I bought some myself.
Shipping time from China can be long; I think I received mine almost 3 weeks after ordering.



			https://m.de.aliexpress.com/item/1005002063221667.html?spm=a2g0n.productlist.0.0.6ad361f2VD21d2&browser_id=63d275f6859048d7b9107db2770720e0&aff_trace_key=e9dbc6fb9c2945fe99cd0edfd867d407-1621197532413-08282-UneMJZVf&aff_platform=msite&m_page_id=nmnhgfmutmccasxj17b23bcf67a1bfdf797214a1ec&gclid=&_imgsrc_=ae01.alicdn.com%2Fkf%2FHd8059a16da38490d97dfcd0e654dceccs.jpg_640x640Q90.jpg
		


The other thing that can break when opening the GPU is the thermal pads.
You should try to save these. The original ones are needed if you want to keep the warranty.

Another question:
Is yours a reference card?
If yes, then it is not easy to replace the thermal paste.
The reference cards don't use paste on the GPU but a thermal pad.
If you apply thermal paste instead, it might not work at all, because a gap could remain.
The reference cooler is not designed for thermal paste, so the gap is larger, and one would need to tighten the screws very hard to get contact when using paste.
By the way, that is a special high-thermal-conductivity pad that is not easy to find a replacement for.


----------



## Garlic (Aug 8, 2021)

turbogear said:


> It's not easy to remove those stickers without damaging them.
> 
> In Germany the warranty is void if one removes the stickers, but there is solution to buy replacement stickers from Aliexpress.
> You can find almost every brand warranty stickers there.
> ...


Guess I'll get some of those stickers. I don't have the AMD reference design (thank god); instead I have the Sapphire Nitro, so I guess paste should work fine.




Valantar said:


> Where are you located? If you're in the US, those stickers are meaningless. Elsewhere it's a bit of a crapshoot whether they affect your warranty or not.


I'm in New Zealand; no idea if it's legal here. These stickers are probably one of the worst "features" on a GPU.


----------



## Butanding1987 (Aug 8, 2021)

turbogear said:


> I am still not at end of tuning my 6900XT.
> The OC community at Hardwareluxx is providing some tips what one can do in MPT.
> I have not explored all options that are offered in the latest Beta 6 version of MPT.
> When I get more time I will do so.
> ...


That sucks. Have you tried overclocking per CCD? My 5900X can run at 4.725GHz (CCD1) and 4.525GHz (CCD2) at 1.25v (TEL) or 1.271v (VID).

I think we have the same type of RAM. Igor's settings are actually a bit loose. I can run mine at CL14-8-14-14-28 at 3800MHz, rock stable. I can also run it at 3933MHz, but with WHEA errors; I have yet to encounter a BSOD though, so that's how I run it.

Samsung B-die is sensitive to heat. I encountered errors when temps hit 46 degrees Celsius, so I decided to watercool my two sticks (2x16GB). I also tried adding a fan, which limited temps to 44-45 degrees, but it still wasn't enough. So I had to put the sticks under water.

Take a look at my CTR 2.1 settings, maybe you can compare your settings with mine. I'm using Hydra now, which is a lot more stable. I love my 5900x but the 5950x is an entirely different type of beast.


----------



## Valantar (Aug 8, 2021)

Butanding1987 said:


> That sucks. Have you tried overclocking per CCD? My 5900x can run at 4.725GHz (CCD1) and 4.525GHz (CCD2) at 1.25v (TEL) or 1.271v (VID).
> 
> I think we have the same type of RAM. Igor's settings are actually a bit loose. I can run mine at CL14-8-14-14-28 at 3800MHz, rock stable. I can run it at 3933MHz but with WHEA errors though I have yet to encounter BSODs so that's how I run it.
> 
> ...


A bit OT, but where is Hydra available? Is it only through 1usmus' patreon?


----------



## Butanding1987 (Aug 8, 2021)

Valantar said:


> A bit OT, but where is Hydra available? Is it only through 1usmus' patreon?


Yes, it's supposed to be available only on Patreon. But if you look around, you'll see it on a Chinese website (leaked).


----------



## turbogear (Aug 9, 2021)

Butanding1987 said:


> That sucks. Have you tried overclocking per CCD? My 5900x can run at 4.725GHz (CCD1) and 4.525GHz (CCD2) at 1.25v (TEL) or 1.271v (VID).
> 
> I think we have the same type of RAM. Igor's settings are actually a bit loose. I can run mine at CL14-8-14-14-28 at 3800MHz, rock stable. I can run it at 3933MHz but with WHEA errors though I have yet to encounter BSODs so that's how I run it.
> 
> ...


With regards to RAM, it seems we have the same thoughts.
I have had this new RAM for a week now and was already looking for a water cooling solution before you mentioned it.
Which water block do you use for your DDR?

I tried per-CCD OCing using my Dark Hero motherboard's built-in Dynamic OC Switcher, with CCD0 at 4.6GHz and CCD1 at 4.5GHz and Core VID from 1.3 up to 1.35v.
It was crashing.

The CTR 2.1 version that I can find publicly seems to be missing some of the features that you show in your screenshot.
Is that the version you get if you have Patreon?


			https://www.patreon.com/posts/ctr-2-1-rc6-ver-53600047
		


I would like to try Hydra as well if I can get it, but it seems to be available only to Patreon supporters.

My problem with the CTR 2.1 version that I have is that it never fully completes the Diagnose run.
When running Diagnose, the PC crashes; after a restart it continues, but it does not find meaningful stable settings.
I tried manually, but without a good base from the auto Diagnostic I did not have much success.

I tried your settings, only changing the PX profile, as 5025MHz was unstable.
I noticed that I had been using lower voltages at P1 and P2.
Your settings seem to work on my processor except for the PX profile.
I did not spend too much time checking stability though.

Similar settings in OC Switcher result in an immediate crash.
It seems CTR does a better job here.
I still don't understand why CTR is not able to run a successful Diagnostic on my system.
The values it suggested were too high for the voltages it applied. I tried to stay around the suggested values and it did not work.
Now, using your settings as a base, I will try to see what mine can reach.




The Time Spy CPU scores that I get, though, are similar to what I get with PBO + Curve Optimizer alone, without a manual OC.
With your settings, as well as with PBO + Curve Optimizer, the score is in the same 14300-14550 range, varying between runs.

I think you get higher than 15000 because you have tuned your memory further.
Memory seems to bring a good boost. If you remember, my CPU score was in the 12500 range with the older 3600 CL14 single-rank memory.
Going to 3800 CL14 dual-rank brought about 2000 points more.

My new memory is still running at the XMP profile.
I have not had time to tune it further since I bought it, and I need a better cooling solution for the RAM first; maybe water cooling.

CPU score with your settings without PBO or Curve Optimizer.


CPU score in one of the runs with PBO + Curve Optimizer without CTR and without manual OCing the CPU:


As I said, it varies between runs. The above was one of the best CPU scores, but it was not the best GPU run.
It would be nice if both hit top performance in the same run.
Maybe I need to go to Safe Mode so that background processes don't steal CPU performance.

I noticed that the Aquasuite X software running in the background was costing me about 200 points in the Time Spy CPU score.
While running, it collects water loop, CPU, and GPU sensor data and sends it to the Aquacomputer Vision, which steals some processor resources.
Both of the above runs are with the Aquasuite X service stopped.


----------



## Butanding1987 (Aug 10, 2021)

Make sure your BIOS settings are correct for CTR (global C-states and CCPC enabled, CO disabled, etc.). Your LLC is also important. For Asus the recommendation is 3. CPU voltage should be default.

Run Cinebench to check for vdroop. Safe voltage for P1 is 1250mv and 1325mv for P2. Your TEL will be lower because of vdroop, that’s why I added 21mv and 13mv for P1 and P2, respectively. Your Cinebench run will tell you how much more you need (check CTR window for the reading).

Disable the PX profile in the meantime because it's the hardest to tweak. Or you can try 4.95GHz at 1.425v for PX high, 4.875GHz at 1.4v for mid, and 4.775GHz at 1.375v for low.

I’m using a RAM block from Barrow. About RAM, yes, 3dmark favors fast RAM.

Here are my RAM settings. You can also use TRFC=244 if it boots.
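For context on that tRFC value: DDR4 timings are counted in memory-clock cycles, so the absolute refresh time depends on the clock. Assuming the kit runs at DDR4-3800 (1900MHz MCLK), a quick conversion sketch:

```python
def trfc_ns(trfc_cycles, mclk_mhz):
    # tRFC in nanoseconds = cycles / clock; MCLK is half the DDR4 transfer rate
    return trfc_cycles / mclk_mhz * 1000

print(f"{trfc_ns(244, 1900):.1f} ns")  # ~128.4 ns at DDR4-3800
```

Well-binned Samsung B-die is commonly tuned into the high-120s of nanoseconds, which is why 244 cycles at 3800 is an aggressive but plausible target - provided the sticks are kept cool.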





P.S.
I sent you a PM.


----------



## turbogear (Aug 10, 2021)

Butanding1987 said:


> Make sure your BIOS settings are correct for CTR (global C-states and CCPC enabled, CO disabled, etc.). Your LLC is also important. For Asus the recommendation is 3. CPU voltage should be default.
> 
> Run Cinebench to check for vdroop. Safe voltage for P1 is 1250mv and 1325mv for P2. Your TEL will be lower because of vdroop, that’s why I added 21mv and 13mv for P1 and P2, respectively. Your Cinebench run will tell you how much more you need (check CTR window for the reading).
> 
> ...


Thanks a lot for the info.
For the BIOS settings I used the suggestions from Yuri Bubli in his guide video.


			https://www.youtube.com/36990d23-e2a9-4e73-98ee-0483c7e3aadd
		


For CTR 2.1, he was only changing the VRM (LLC) settings in the BIOS, therefore I left CCPC, global C-states, etc. on default.
I used to change these too on previous versions of CTR.
I will change these and try.

Thanks for the hint regarding vdroop.
I am still in the learning phase when it comes to CPU and RAM OCing. 
In the past my focus was mostly on OCing the GPU to get the best gaming performance; I did not care much about the Time Spy global ranking.


----------



## Kabouter Plop (Aug 10, 2021)

Another new driver, another Time Spy boost this time?
Getting my third Samsung Odyssey G9 this week; the last two had dead pixels.


----------



## dragospetre88 (Aug 10, 2021)

Got in-depth details on the thermal pads that come from the factory on the Red Devil 6900 XT Limited (7 W/mK pads are used). The response is straight from PowerColor:

- Hardness: 20 Shore 00
- MEM pad size: 10*12*1.65t (mm)
- Left side MOS pad size: 7*64*1.5t (mm)
- Right side MOS pad size: 7*105*1.5t (mm)


----------



## Felix123BU (Aug 10, 2021)

Kabouter Plop said:


> Another new driver, another Time Spy boost this time?
> Getting my third Samsung Odyssey G9 this week; the last two had dead pixels.


Wish you luck; it must be super frustrating to pay a ton of money and receive faulty products  

If I weren't still hopeful about OLED-like tech making its way into monitors, that G9 would be a strong threat to my wallet


----------



## Kabouter Plop (Aug 10, 2021)

Felix123BU said:


> Wish you luck, must be super frustrating to pay a ton of money and receive faulty products
> 
> If I would still not be hopeful of oled-like tech making its way into monitors, that G9 would be a strong threat to my wallet



I may receive one tomorrow, but I feel like they received an entire batch of faulty products that were never meant to be sold as new. First 3 dead pixels, then 15+. I'm not gonna accept anything less than perfect, especially after the first one had 3 dead pixels (then 1 recovered, and a new one died 2 weeks later), followed by a second panel with 15+ dead pixels. Yeah, no thanks. I'm gonna be cautious for 100 days straight if this one is perfect, because I have a 100-day return period.
edit: arrives in 1-2 hours now
I received my third panel with 2 dead pixels near the top edge of the screen, 1 faint and another clearly dead. I keep winning the dead pixel lottery.


----------



## droopyRO (Aug 13, 2021)

Does the RX 6600 XT support Smart Access Memory?


----------



## Felix123BU (Aug 13, 2021)

droopyRO said:


> Does the RX 6600 XT support Smart Access Memory?


If your motherboard does, it should


----------



## Garlic (Aug 13, 2021)

droopyRO said:


> Does the RX 6600 XT support Smart Access Memory?


Yes it does, just don't buy one. Trash value for what it is.


----------



## GamerGuy (Aug 13, 2021)

Just saw the locally distributed Sapphire cards in my neck of the woods: 
Pulse RX 6600 XT ~ 515USD, 
Nitro+ RX 6600 XT ~ 540USD. 

These prices are local currency converted to USD...not great, but not as bad as I'd thought it'd be.


----------



## Felix123BU (Aug 13, 2021)

GamerGuy said:


> Just saw the locally distributed Sapphire cards in my neck of the woods:
> Pulse RX 6600 XT ~ 515USD,
> Nitro+ RX 6600 XT ~ 540USD.
> 
> These prices are local currency converted to USD...not great, but not as bad as I'd thought it'd be.


Well, you can't overinflate an already overinflated price much more; those cards should have been 250-300 USD. AMD did the pre-scalping themselves   
But yeah, these seem to be available everywhere at close to MSRP.

In my neck of the woods all cards are available from both AMD and Nvidia, except that they are all twice as expensive as they should be and nobody is buying them. The most numerous are the 3080 Ti models; I guess people here said heeeeell no, thank you, no more 

It has been a crazy year, and the fact that I got my 6800XT at launch for MSRP is only now becoming a small wonder  Best GPU I ever had, still love it!


----------



## Butanding1987 (Aug 13, 2021)

I'm baaaack.


----------



## Felix123BU (Aug 13, 2021)

Butanding1987 said:


> I'm baaaack.
> 
> View attachment 212429


 Never doubted it.

New driver any good for improvements in this?

Didn't you have a 5900X? I see a 5950X there


----------



## droopyRO (Aug 13, 2021)

Felix123BU said:


> If your motherboard does, it should


I had to roll back the BIOS from F35 to F33 and then re-upgrade it to F35 to get SAM to work. The mobo is an X570 Elite and the GPU is an RX 6600 XT. I had previously tried everything: reinstalling both GPU and chipset drivers, different PCIe slots, resetting the BIOS. Sometimes troubleshooting these things makes no sense until you find the solution.


Garlic said:


> Yes it does, just don't buy one. Trash value for what it is.


Maybe where you live. But I got mine for less money than a GTX 1660 or RX 5500, which have insane prices.


----------



## turbogear (Aug 13, 2021)

Butanding1987 said:


> I'm baaaack.
> 
> View attachment 212429


Welcome back with 1st place on the 5950X.  
Seems like you have given up on the 5900X and went to its big brother to get higher than 22k on the overall score. 

You will not see any scores from me in the coming weeks. 
I am going on vacation until mid-September, starting at the end of next week.

By the way, I heard rumors on Hardwareluxx that AMD will launch a refresh of the 6900XT in October. The rumors say similar performance with less power consumption and a little improved RT performance.
The guy who said this has usually been spot on with his info in the past.
Seems like he has some connections to AMD. 









[Sammelthread] Offizieller AMD RX 6700 / 6700 XT / 6750 XT / 6800 / 6800 XT / 6900 XT / 6950 XT Overclocking und Modding Thread (Wakü - Lukü - LN2) - www.hardwareluxx.de


----------



## Butanding1987 (Aug 13, 2021)

Felix123BU said:


> Never doubted it.
> 
> New driver any good for improvements in this?
> 
> Didn't you have a 5900X? I see a 5950X there


Thank you for your support.  I don't think it was the driver; more like the cold from the air-con. And of course, the lovely 5950X. I came upon an offer I couldn't resist, so I bit.



turbogear said:


> Welcome back with 1st place on 5950X.
> Seems like you have given up on 5900X and went to big brother to get higher than 22k on overall score.
> 
> You will not see any scores from me in the coming weeks.
> ...


Enjoy your vacation. You can lend me your 6900 XT while you’re out. I’m tired of the 6800 XT. 

This VikTOR guy is very competitive. I'm not sure how far I'm willing to go.


----------



## Felix123BU (Aug 14, 2021)

Butanding1987 said:


> Thank you for your support.  I don’t think it was the driver. More like the cold from the air-con.And of course, the lovely 5950x. I came upon an offer I couldn’t resist so I bit it.
> 
> 
> Enjoy your vacation. You can lend me your 6900 XT while you’re out. I’m tired of the 6800 XT.
> ...


You shall smoke that VikTOR, I have faith in you 
Just please don't smoke your card too; it would be a pity, you have a diamond-sample 6800XT, collector's edition


----------



## Butanding1987 (Aug 14, 2021)

Felix123BU said:


> You shall smoke that VikTOR   I have faith in you
> Just please don't also smoke your card, would be a pity, you have a diamond sample 6800XT, collectors edition


Thanks. If you only knew what this card has gone through.


----------



## Nargon33 (Aug 16, 2021)

What would happen if I used the Red BIOS Editor to edit my new 6600 XT BIOS and flashed it to my card? It has a dual BIOS switch since it's an XFX Merc model. The card is not officially supported by the editor, or by the flashing tool Igor lists in his instructions, but it allows me to load my reference BIOS into RBE, and load an MPT file as well.

I know going beyond 1.15V by writing to the SPPT causes it to lock to 500MHz, since AMD locked people out of overvolting at the end of 2020 (I think it was 300MHz on RDNA1). But I thought I read in the instructions that writing to the BIOS is a workaround. Is this true? Could I try 1.18V or even 1.20V on my 6600 XT? I'm tempted to try since I have dual BIOS, but I'd like to know if someone already tried and it's pointless. I think the flashing utility might not work yet on the 6600 XT, but I'm not sure. There is another official flashing utility I can find, but it needs a signed BIOS, which a custom BIOS is not.


----------



## turbogear (Aug 16, 2021)

Nargon33 said:


> What were to happen if I used the Red BIOS Editor to edit my new 6600xt BIOS and upload it to my card? It has a dual BIOS switch since it's an XFX Merc model. Card is not officially supported for the editor, or the flashing tool Igor lists on instructions, but it allows me to load my reference BIOS in to RBE, and load an MPT file as well.
> 
> I know going beyond 1.15v with writing to the "SPPT" file causes it lock to 500mhz since AMD locked out people from overvolting at the end of 2020 (I think it was 300mhz on RDAN1). But I thought I read in the instructions that writing to BIOS is a workaround. Is this true? Could I try 1.18v or even 1.20v on my 6600xt? I'm tempted to try since I have dual BIOS, but I'd like to know if someone already tried and it's pointless. I think the flashing utility might not work yet on the 6600xt, but I'm not sure. There is another official flashing utility I can find, but it needs a signed BIOS, which a custom BIOS is not.


I am not sure, but it could very well be that the BIOS gets bricked.
In that case you would need a programmer like the CH341A to flash back the original BIOS.
As far as I remember, a few months ago many experts at the Igor's Lab forum were looking into modding RDNA2 BIOSes, but at that time they were not successful.
The modded BIOS was bricking the cards.

I can try such crazy things as I own the programmer and can recover my card's BIOS afterwards.


----------



## Nargon33 (Aug 16, 2021)

turbogear said:


> I am not sure but it could very well be that the bios will be bricked.
> In that case you will then need a programmer like CH341A to flash back original bios.
> As far as remember, a few months ago many experts at Igor lab forum were looking into moding RDNA2 bioses but at that time they were not successful.
> The moded bios was bricking the cards.
> ...


I just read on the Igor's Lab forums that someone tried it with a 6700 XT and it bricked their card. I heard all you really have to do is start the PC with the integrated CPU graphics, or with the card switched to the 2nd BIOS, then switch back to the broken BIOS and flash a working original BIOS onto it to reverse it.

They accidentally left some code in there that makes you think it's going to work for the 6900 XT cards when you try it. I'm also curious what would happen if I tried flashing an official, properly signed 6700 XT BIOS onto my 6600 XT. Would the extra cache, VRAM, and compute units confuse the crap out of it? Or would it by some miracle work? I don't know what other hidden voltages there are that might actually cause damage.


----------



## turbogear (Aug 16, 2021)

Nargon33 said:


> I just read on Igor's Lab forums  that someone tried it with a 6700xt and it bricked their card. I heard all you really have to do is start the PC with the integrated CPU graphics, or with it switched to the 2nd BIOS, then switch back to the broken BIOS and flash a working original BIOS onto it to reverse it.
> 
> They left some code in there accidently that makes you think it's going to work for the 6900xt cards when you try it. Also curious what happen if I tried flashing like an official and signed proper 6700xt BIOS onto my 6600xt. Would the extra cache, VRAM, and compute units confuse the crap out of it? Or would it by some miracle work? I don't know what other hidden voltages there are that might actually cause damage.


Yes, what you said about flashing back the original BIOS could work, but it does not always, especially if a modded BIOS was used.
I read on some forums about users who flashed a 6800XT with a 6900XT BIOS. The 6800XT was bricked.
In some cases where a modded BIOS was used, flashing back the original 6800XT BIOS without a programmer was not possible.

I don't believe you will be able to flash a 6600 XT with a 6700 XT BIOS without bricking it.
People even tried to flash a regular 6900XT with the 6900XT XTXH version BIOS. Both are supposed to be the same card, but one is binned higher. This did not work; the regular 6900XT was bricked.


----------



## Felix123BU (Aug 17, 2021)

Yay, finally found a patch for my itch, namely a way to use FSR in Cyberpunk   

Guess which one is native and which one is FSR


----------



## HD64G (Aug 17, 2021)

Nice! Both seem the same to me. Could that be on the left with the FSR sharpening some textures?


----------



## Kabouter Plop (Aug 17, 2021)

Cyberpunk got FSR support ?


----------



## Felix123BU (Aug 17, 2021)

Kabouter Plop said:


> Cyberpunk got FSR support ?


Nope, sadly no, and not even in the upcoming 1.3 patch  
I found an upscaling app on Steam (Lossless Scaling, I think it's called) that can upscale any game, and it recently got FSR support, so I tested it on Cyberpunk, and I am really amazed by how good it looks.
It's a bit tricky to use, since in order to get the exact FSR quality levels you have to set up custom resolutions, and it only works in windowed mode, which is an issue if you have an HDR monitor; HDR and windowed mode don't go well together.

But it's proof of concept that FSR and Cyberpunk would be fantastic together, if only CDPR would not be weird about it.



HD64G said:


> Nice! Both seem the same to me. Could that be on the left with the FSR sharpening some textures?


Yup, left is FSR   Though with the tool I used you can adjust the sharpening filter.
Went from 42 FPS with Ultra and RT Reflections on to 58 FPS, with the equivalent of FSR Ultra Quality, upscaled from 2648x1108 to 3440x1440.
In game, while moving, there is no difference vs native; you have to stop and hunt for pixels to see any difference.
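For anyone setting up the custom resolutions this trick needs: FSR 1.0's quality modes correspond to fixed per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x), so the render resolution can be computed from the native one. A small sketch; rounding to the nearest even pixel is an assumption to keep the custom resolutions display-friendly:

```python
# Sketch: compute FSR 1.0 internal render resolutions from a native resolution.
# The per-axis scale factors below are the ones AMD published for FSR 1.0.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(native_w, native_h, mode):
    """Internal render resolution for a given FSR quality mode,
    rounded to the nearest even pixel."""
    s = FSR_SCALE[mode]
    round_even = lambda x: int(round(x / 2)) * 2
    return round_even(native_w / s), round_even(native_h / s)

for mode in FSR_SCALE:
    w, h = fsr_render_resolution(3440, 1440, mode)
    print(f"{mode}: {w}x{h}")
# Ultra Quality at 3440x1440 comes out to 2646x1108; the post's 2648x1108
# is the same idea with slightly different rounding.
```
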


----------



## Kabouter Plop (Aug 17, 2021)

Honestly, it pisses me off when game devs play favorites. Just add both FSR and DLSS; don't pick for players what they want to use. I'm just glad I can finally use AMD ReLive the way I could never use ShadowPlay.


----------



## Felix123BU (Aug 17, 2021)

Kabouter Plop said:


> Honestly pisses me off gamedevs that play favorites, just add both fsr and dlss, do not pick for them what they wanna use, im just glad i can finally use amd relive the way i could never use shadowplay now.


Same here, I don't really care which one is better, as long as everybody can use one or the other as needed. For the first time in like ever I watched CDPR's Twitch stream, hoping they would announce FSR, knowing that some reported it to be very easy to implement. I was annoyed and also amused after watching: free DLCs, a cat, and 2 shirts  

My 6800XT breezes through the game at Ultra 3440x1440, but goes below the 50 FPS minimum of my FreeSync range. FSR would keep it above that with RT on, hence why I want it.


----------



## Kabouter Plop (Aug 17, 2021)

It's kinda hilarious either way, considering Cyberpunk is on consoles too, so they would benefit from FSR being added as well.


----------



## mama (Aug 17, 2021)

Garlic said:


> Yes it does, just don't buy one. Trash value for what it is.


Maybe, but what isn't "trash value" at the moment? Get what you can if you need something now.



Kabouter Plop said:


> Its kinda hilarious either way considering cyberpunk is on consoles to they would benefit of FSR being added as well.


Agreed but obviously CDPR needs Nvidia on side otherwise it would have happened already.


----------



## Felix123BU (Aug 18, 2021)

mama said:


> Maybe but what isn't "trash value" at the moment.  Get what you can get if you need something now.
> 
> 
> Agreed but obviously CDPR needs Nvidia on side otherwise it would have happened already.


In today's GPU market any card is uber trash value. The 6600XT at MSRP, keeping all the other prices in mind, is decent, but in a normal market its price would be an insulting bad joke, to put it nicely.

I am still not sure if CDPR does not add FSR because of Nvidia, or just because they don't care, or are a bit incompetent; any of those is possible.
One thing is clear though: if you are a company that went from being the shining knight to being the evil witch, you cannot pass up such an opportunity to add a quick fix for a huge portion of your potential player base. It's just bad business not to add FSR at this point in time.

And I am not saying this just because I have a 6800XT that plays the game just fine at 3440x1440 Ultra with no RT (75 FPS avg). I can understand how people with lower-performing cards must feel when they would like to enjoy the game but can't, knowing that there is an easy fix out there that could improve things, but the company does not give a rat's ass about offering it to them....


----------



## Kabouter Plop (Aug 18, 2021)

FSR would be great, honestly, along with HDR at 5120x1440; in Cyberpunk on my 6900 XT I get like 50 FPS at this res with everything maxed out.


----------



## Nargon33 (Aug 18, 2021)

Felix123BU said:


> In todays GPU market any card is uber trash value, the 6600XT at MSRP keeping in mind all other prices is decent, but in a normal market its price would be an insulting bad joke to say it nicely.
> 
> I am still not sure if CDPR does not add FSR because of Nvidia or just because they don't care or are a bit incompetent, anyone of those can be possible.
> One thing that is clear though, if you are a company that went from being the shining knight to being the evil witch, you can not pass up such an opportunity to add a quick fix for a huge portion of your potential player base. Its just bad business not adding FSR at this point in time.
> ...


You can use something called Magpie to force FSR onto Cyberpunk in a janky way. But the performance hit of FSR on old Nvidia cards not designed for it makes it not worth it. I tried on an older GTX 1060 3GB: I gained 10 FPS from running at a lower resolution and then lost 8 FPS of that from running FSR. It improved the image, but not back to native, and only gained 2 FPS. 

Cyberpunk has FidelityFX built in, so I don't think Nvidia is restricting them hard.


----------



## Felix123BU (Aug 18, 2021)

Nargon33 said:


> You can use something called Magpie to force fsr onto cyberpunk in a janky way. But the performance hit of FSR on old Nvidia cards not designed for it make it not worth it. I tried on an older gtx 1060 3gb. Gained 10 fps for running at a lower resolution and then lost 8 fps of that for running FSR. It improved the image but not back to native. And only gained 2 fps.
> 
> Cyberpunk has Fidelity FX build in so I don't think Nvidia is restricting them hard.


As said before, I am using an app on Steam called Lossless Scaling, with pretty good performance results upscaling from 2648x1108 (which for my native resolution would be the equivalent starting point of FSR Ultra Quality) to 3440x1440. That gives me around 38% extra frames at basically identical quality, so from 42 FPS to 58 FPS with Ultra and RT Reflections on.

The only drawback is that it has to run in windowed mode, and HDR does not really work in windowed mode with this game.


----------



## Hellfire (Aug 19, 2021)

Preparing the rig... Ready to join the club


----------



## Nargon33 (Aug 19, 2021)

Felix123BU said:


> As said before, I am using an app on Steam called Lossless upscaling, with pretty good performance results upscaling from 2648x1108 (which for my native resolution would be the equivalent starting point of FSR Ultra) to 3440x1440 which gives me around  28% extra frames at basically identical quality, so from 42 FPS to 58 FPS with Ultra and RT Reflections on.
> 
> The only drawback is that is has to run in windowed mode, and HDR does not really work in windowed mode with this game.


Sounds exactly like how Magpie works: windowed mode only. On a powerful GPU at higher resolutions it's more beneficial, but with slower cards it's hardly worth it. I've got a 6600 XT and it's definitely better at handling it. But since I'm playing on a 1080p monitor I just don't bother and would rather sacrifice a few frames for native. I'd imagine your card could truly benefit a lot, though, since the lower the internal resolution you run, the better it performs.


----------



## Felix123BU (Aug 19, 2021)

Nargon33 said:


> Sounds exactly like how Magpie works. Windowed mode only. On a powerful gpu at higher resolutions it's more beneficial, but with slower cards it's hardly worth it. I got a 6600xt and it's definitely better at handling it. But since I'm playing on a 1080p monitor I just don't bother, and would rather sacrifice a few frames for native. I'd imagine that card could truly benefit a lot though, since the lower internal resolution you run, the better it performs.


Yeah, at 1080p the 6600XT is more than enough, as my 6800XT is more than enough for me, but in certain scenarios both would benefit from FSR, like games with RT that break the cards in half   
Not that there are a lot of games where turning on RT is really worth it (CP 2077 is one of them), but it's still nice to have the option.


----------



## turbogear (Aug 20, 2021)

Hellfire said:


> View attachment 213223
> 
> Preparing the rig... Ready to join the club


Welcome to the club, another 6900XT Liquid Devil Ultimate owner. 
I hope you will enjoy your new card.
I own the same one, but I changed the BIOS to the XTXH-LC reference card one. I had to open the card and flash it with a CH341A programmer. 
My Devil could not OC its memory at all before, but after the BIOS change it is working at 2320MHz.

My GPU clock speed is not going higher than 2785MHz.
If you are into OCing and benchmarking, we will see if you are one of the lucky ones with a card hitting higher than 2800MHz in Time Spy. 

I have also replaced the standard thermal paste with liquid metal.
Without LM my card was hitting 95°C under OC higher than 2700MHz. Now temperatures remain below 80°C.

My Time Spy graphics score is ~24400.
People at Hardwareluxx are able to hit 25K or higher, but under special benchmarking conditions, like unstable single-run settings or putting the radiators into an ice bucket. 
My setting of 2585MHz min and 2785MHz max at 1108mV is more of a stable setting which could be used on a daily basis.
The only thing is, I have the power targets set very high using MPT: 450A TDC and 500W core GPU power. 

It seems good cooling, with loop temperatures in the range of 20°C, gives an extra 200-300 points in Time Spy.
My normal water cooling loop sits at about 30°C at around a 22°C room temperature.

After my vacation, I will see if I should invest in a Mo-Ra. 
This thing seems to help a lot with getting good loop temperatures while the fans don't have to run at very high speed.





Watercool MO-RA3 420 PRO Black high-end radiator, 259,95 € - shop.watercool.de


----------



## Felix123BU (Aug 20, 2021)

turbogear said:


> Welcome to the club another 6900XT Liquid Devil Ultimate owner.
> I hope you will enjoy your new card.
> I own the same one but I changed the bios to XTXH-LC reference card one. I had to open the card and flash it with ch341a programmer.
> My Devil could not OC memory at all before but after bios change it is working at 2320MHz.
> ...


Now they have to make a case to fit that Mo-Ra


----------



## Hellfire (Aug 20, 2021)

To be fair, cooling wise I already have 2x 480mm x 60mm EK Radiators and room for another 2x 360x60mm Radiators if I need it.


----------



## turbogear (Aug 20, 2021)

Hellfire said:


> To be fair, cooling wise I already have 2x 480mm x 60mm EK Radiators and room for another 2x 360x60mm Radiators if I need it.


What type of case do you use? 
It's amazing that you have that much space.  
My Cosmos C700P has 1x120mm, 1x240mm, and 1x360mm rads inside. 
Theoretically I can fit an additional 1x240mm at the bottom, but the space is tight due to all the cables coming out of the PSU.


----------



## Hellfire (Aug 20, 2021)

turbogear said:


> What type of case do you use?
> That is amazing that you have that much space.
> My Cosmos c700p has 1x120mm, 1x240mm  and 1x360mm rads inside.
> Theoretical I can fit additional 1x240mm at bottom but the space is tight due all cables coming out of PSU.


Corsair 1000D, it's an absolute monster


----------



## turbogear (Aug 20, 2021)

Hellfire said:


> Corsair 1000D it's an absolute monster


Oh yes, I know it; I was interested in it, but as you said it is quite a monster and needs a bit more space than mine.

I want to wish you all in this club a nice time over the next three weeks.
I will leave in the morning for 3 weeks to India.
If my special roaming card works as planned, I will visit here from time to time.

You all have fun with your RDNA2 cards. My 6900XTU will get a break from my benchmark torturing. He will be happy to get a break. 
The other 6800XT that is in my son's computer will continue serving him with games while I am away. 
Due to the current COVID situation, it's not easy to travel as a family, so I will go alone to visit my parents.


----------



## Hellfire (Aug 20, 2021)

turbogear said:


> Oh yes, I know it as I was interested in it but as you said it is quite a monster and need a bit more space than mine.
> 
> I want to wish you all nice time over the next three weeks.
> I will leave in the morning for 3 weeks to India.
> ...


Good luck and have a safe trip buddy


----------



## turbogear (Aug 20, 2021)

Hellfire said:


> Good luck and have a safe trip buddy


Thanks a lot.


----------



## Butanding1987 (Aug 21, 2021)

turbogear said:


> He will be happy to get a break.
> The other 6800XT that is in my son's computer will continue serving him with games while I am away.
> Due to current COVID situation, it's not easy to go as family so I will go alone to visit my parents.



Enjoy your vacation. And keep safe.


----------



## Felix123BU (Aug 21, 2021)

turbogear said:


> Oh yes, I know it as I was interested in it but as you said it is quite a monster and need a bit more space than mine.
> 
> I want to wish you all in this club a nice time over the next three weeks.
> I will leave in the morning for 3 weeks to India.
> ...


Yup, safe trip and be careful; you can get only one turbogear, but a turbogear can get 10 Navi 21 cards


----------



## INSTG8R (Aug 21, 2021)

I just caved because I had the cash on hand but got the card I wanted and finally completed my PC upgrades.


----------



## Kabouter Plop (Aug 22, 2021)

Anyone else have a Samsung Odyssey G9? I went from 2 to 1 dead pixels now. However, yesterday while gaming the screen went black and the monitor's menu became unresponsive; I had to unplug the DisplayPort cable from my PC and reseat it to get an image again. There was no GPU driver crash either, as if the screen just tilted. I'm very stressed, because the store won't replace the unit another time, only refund me, and I just want a proper Samsung G9 unit. Samsung offers no warranty and hides behind every webshop, telling you that they should replace it, even though it's Samsung's responsibility. I had a 2-year-long fight about warranty 10 years ago, ending with tinnitus and a refund. How are they allowed to do this? It's not okay. It's not like there is another screen that is ultrawide + 240Hz, so I really did not even have a choice.


----------



## GerKNG (Aug 22, 2021)

Kabouter Plop said:


> Anyone else have a Samsung Odyssey G9? I went from 2 to 1 dead pixels now. However, yesterday while gaming the screen went black and the monitor's menu became unresponsive; I had to unplug the DisplayPort cable from my PC and reseat it to get an image again.
> 
> ...


that's how Samsung's G7 and G9 work: 
broken pre-alpha monitors with more bugs than features.
Black screens, crashes, slow restarts, flickering... 
My G7 has the same issues.


----------



## Kabouter Plop (Aug 22, 2021)

GerKNG said:


> that's how Samsungs G7 and G9 works.
> a broken pre alpha monitor with more bugs than features.
> black screens, crashes, slow restarts, flickering...
> my G7 has the same issues.



Odd, even my first G9, which I used for 2 weeks, did not suddenly go black at random and go tilt like that.


----------



## droopyRO (Aug 22, 2021)

I got my RX 6600 XT about a week ago. I got to play around with it this weekend and made a few comparison clips against the "old" GTX 1070 Ti.
I still have not decided whether to keep the 6600 XT (since I got it close to MSRP), but I'm inclined to keep it. I have a few days to make up my mind.
This card handles 1440p really well if you get it close to MSRP. I don't know why AMD marketed it as a 1080p card; maybe just to differentiate it from the RX 6700 XT.


----------



## Felix123BU (Aug 22, 2021)

droopyRO said:


> I got my RX 6600 XT about a week ago. I got to play around with it this weekend, and made a few comparison clips with the "old" GTX 1070 Ti.
> I still have not decided if to keep the 6600 XT (since i got it close to MSRP) but i'm inclined to keep it. I have a few days to make up my mind.
> This card handles 1440p really good if you get it close to MSRP. Don't know why AMD marketed it like a 1080p card, maybe just to differencied it from RX 6700 XT.


From your vids the 6600XT is around 50% faster. If the 6600XT were not priced so high it would be an excellent card....


----------



## droopyRO (Aug 22, 2021)

Between 20 and 50%, depending on the game or scene. It is not across the board. 
I got the "cheap" (460 Euros) PowerColor Fighter model. All the tests are done with an undervolt to 1.09V and the fans set to a maximum of 65%. It gets to about 75°C with a 91°C hotspot at 2000 RPM. The board weighs only 510 grams; the MSI Gaming X 1070 Ti is 1078 grams! So PowerColor went with the cheapest cooler they could find for this model.


----------



## GamerGuy (Aug 23, 2021)

droopyRO said:


> Between 20 and 50% depending on the game or scene. It is not across the board.
> I got the "cheap" (460 Euros) Powercolor Fighter model. All the test are done with an undervolt to 1.09V and the fans set to a maximum of 65%. It gets to about 75ºC with 91ºC hotspot at 2000 rpm. The board weights only 510 grams, the MSI GamingX 1070 Ti is at 1078 grams ! So Powercolor went with the cheapest cooler they found on this model


I'm willing to bet good money that the Red Dragon/Red Devil would show a fairly good improvement in temps. My Sapphire Nitro+ RX 6900 XT hangs around the 70s while gaming (edge temp), with the highest hot spot temp I've seen at 86°C (after about an hour of RE Village at 3840x1080, max in-game settings + RT enabled).


----------



## mrthanhnguyen (Aug 23, 2021)

Got the card and the block. Just don't have time to install it.


----------



## droopyRO (Aug 23, 2021)

GamerGuy said:


> I'm willing to bet good money that the Red Dragon/Red Devil would show a fairly good improvement in temps. My Sapphire Nitro+ RX 6900 XT hangs around the 70s while gaming (edge temp), with the highest hot spot temp I've seen at 86°C (after about an hour of RE Village at 3840x1080, max in-game settings + RT enabled).


Yup, they are, but they were about $50-100 more expensive at launch. I didn't want to spend the extra money at the time. And call me whatever you want, but "Red Devil" is not a name I want to purchase, and neither is "Hellhound".

Right now I can't find a single RX 6600 XT close to the launch price; they all have a 40-50% price hike in stores. And there are people trying to sell 6600 XTs at or above RTX 3060 Ti prices on the used market. And it's still the summer holiday season; things might go nuts as we get closer to Black Friday and Christmas.


----------



## Butanding1987 (Aug 23, 2021)

I need to hit 23k. Almost there.


----------



## jesdals (Aug 23, 2021)

If you're not going with water cooling for your cards, take a look at this


----------



## Valantar (Aug 23, 2021)

jesdals said:


> If you're not going with water cooling for your cards, take a look at this


It's fascinating how fan shrouds are constantly being "discovered" by people all over the place. (Also, why didn't he extend his shroud to the front, so that the bottom front intake fan could also blow directly into the GPU?) There's a reason why you'll find shrouds everywhere in HEDT workstations and the like (which typically have _terrible_ airflow affordances by consumer design standards, yet manage to stay perfectly cool and often rather quiet). They work. Keep heat sources in separated flow paths with discrete intakes and exhausts, and things improve quite massively.


----------



## Felix123BU (Aug 23, 2021)

Valantar said:


> It's fascinating how fan shrouds are constantly being "discovered" by people all over the place. (Also, why didn't he extend his shroud to the front, so that the bottom front intake fan could also blow directly into the GPU?) There's a reason why you'll find shrouds everywhere in HEDT workstations and the like (which typically have _terrible_ airflow affordances by consumer design standards, yet manage to stay perfectly cool and often rather quiet). They work. Keep heat sources in separated flow paths with discrete intakes and exhausts, and things improve quite massively.


There is also a reason why fan shrouds never caught on  
Radiator Push vs. Pull vs. Shroud Testing V2 | martinsliquidlab.wordpress.com | Page 7

What the dude in the vid discovered is that getting good airflow to a GPU actually helps, who would have figured


----------



## Valantar (Aug 23, 2021)

Felix123BU said:


> There is also a reason why fan shrouds never caught on
> Radiator Push vs. Pull vs. Shroud Testing V2 | martinsliquidlab.wordpress.com | Page 7


Well, yes, the main issue is that they by definition can't be universal for DIY (socket placements, GPU placement and size, etc vary too much) and adjustable ones look like garbage and perform worse than made-to-fit ones. And expecting even experienced users to DIY an airflow guiding shroud to create separate flow paths for components is pretty unreasonable (especially as shrouds generally get in the way of showing off components). Plus the complications added by open-air GPUs with various exhaust orientations, etc. That didn't stop me from making a foamcore board shroud for the CPU cooler and exhaust fan on my DIY NAS though  Works wonderfully too.

Btw, from what I can understand, the "shroud" in your link is a spacer between the fan and radiator? If so, that serves a rather dramatically different purpose than a shroud guiding flow paths within a case, and thus isn't really applicable.


----------



## Felix123BU (Aug 23, 2021)

Valantar said:


> Well, yes, the main issue is that they by definition can't be universal for DIY (socket placements, GPU placement and size, etc vary too much) and adjustable ones look like garbage and perform worse than made-to-fit ones. And expecting even experienced users to DIY an airflow guiding shroud to create separate flow paths for components is pretty unreasonable (especially as shrouds generally get in the way of showing off components). Plus the complications added by open-air GPUs with various exhaust orientations, etc. That didn't stop me from making a foamcore board shroud for the CPU cooler and exhaust fan on my DIY NAS though  Works wonderfully too.
> 
> Btw, from what I can understand, the "shroud" in your link is a spacer between the fan and radiator? If so, that serves a rather dramatically different purpose than a shroud guiding flow paths within a case, and thus isn't really applicable.


Yes, I was referring to "standard" fan shrouds, if there is such a thing   

But I can totally see a custom-made "shroud" for very specific purposes working very well, especially if it gets air moving to areas inside a case that have very poor circulation. Come to think of it, if one had a 3D printer, there are a lot of cool things one could do for better airflow inside a case.


----------



## Valantar (Aug 23, 2021)

Felix123BU said:


> Yes, was referring to "standard" fan shrouds, if there is such a thing
> 
> But I can totally see a custom made "shroud" for very specific purposes working very well, especially if it gets the air moving to areas inside a case that have very poor air circulation. Thinking of it, if one would have a 3D printer, there would be a lot of cool things one can do for better airflow inside a case.


Yeah, there definitely is. I mean, look at something like a Lenovo ThinkStation P920:




That's essentially two enclosed airflow tunnels (CPU+RAM and PSU) with two semi-open tunnels for AICs (enclosed when the side panel is on, but not enclosed between AICs) in between. Each has its own separate airflow path with no heat dumped across component classes - and that allows them to cool a dual-CPU workstation with up to four huge accelerators with just three case fans (one small fan in the front for each AIC area, and one larger rear exhaust for the CPU tunnel). What allows Lenovo (and HP, and Dell, and all proper workstation OEMs) to do this is that they have complete control over the case design, motherboard layout, heatsinks, etc. Only the AICs are outside of their control, and they are thus left in more open - but still isolated - flow paths.

Doing the same with a DIY PC essentially requires custom fabrication as no two cases are identical, socket positioning varies between motherboards, heatsinks vary wildly in size, etc. But it has _huge_ potential for improving cooling, instead of the brute-force, highly turbulent ATX-standard induced solution of "take a box, add various fans as desired/needed".


----------



## Felix123BU (Aug 23, 2021)

Valantar said:


> Yeah, there definitely is. I mean, look at something like a Lenovo ThinkStation P920:
> 
> 
> 
> ...


If I didn't have a fully water-cooled setup that doesn't need something like this, I would definitely look into something like the above. Before adding the water setup, I was actually thinking about better airflow for my reference 6800XT; I bet that could have benefited a lot from something similar.

But I do love the water-cooled build. Air is fine, but water is another level of cooling, not even close if done right. Not with all the ducts and fans could my 6800XT, OC'd to 2.7GHz with an increased power allowance, stay in the 40s while having 300W flow through it 

But for normal air cooling, this type of air flow control could do small wonders come to think of it.

And btw, this type of info could be really helpful for people on air who want to improve their setups and are adventurous enough


----------



## droopyRO (Aug 23, 2021)

jesdals said:


> If you're not going with water cooling for your cards, take a look at this


I think the AIO is throwing a lot of hot air into the case. Now, if that AIO were mounted topside, then that shroud would not be so effective, since the ambient would be much lower.


----------



## GamerGuy (Aug 24, 2021)

mrthanhnguyen said:


> Got the card and the block. Just don't have time to install it.


Nice! Send it to me, I've got all the time in the world to install the block for you as I'm retired. My only concern is, that it might get 'lost' on the way back to you....


----------



## Space Lynx (Aug 24, 2021)

jesdals said:


> If you're not going with water cooling for your cards, take a look at this



I've done this before, removed those PCI slot covers. I haven't finished the whole video yet, but yeah, I never went that hardcore with it or designed a shroud. I imagine the shroud is what made it work so well; not sure why he said the original shroud looked like a dumpster fire, it looked great to me. I would say he needs a magnetic dust filter though, on the back of that case where the Noctua 80mm is sucking in air. Even if it adds 2-3°C, it's worth it in the long run.

I wouldn't mind getting one of these shrouds and giving it a go, but I am not skilled enough to make one of my own, thoughts?


----------



## Felix123BU (Aug 24, 2021)

lynx29 said:


> I've done this before, removed those PCI slot covers. I haven't finished the whole video yet, but yeah, I never went that hardcore with it or designed a shroud. I imagine the shroud is what made it work so well; not sure why he said the original shroud looked like a dumpster fire, it looked great to me. I would say he needs a magnetic dust filter though, on the back of that case where the Noctua 80mm is sucking in air. Even if it adds 2-3°C, it's worth it in the long run.
> 
> I wouldn't mind getting one of these shrouds and giving it a go, but I am not skilled enough to make one of my own, thoughts?


I really like the idea too. It should not be super difficult to make something like this: buy some 0.5mm plastic sheet, cut it into shape and glue it together. It should be resistant enough if done right. Pity my card is no longer air cooled, I would have jumped on this idea


----------



## Space Lynx (Aug 24, 2021)

Felix123BU said:


> I really like the idea too. It should not be super difficult to make something like this: buy some 0.5mm plastic sheet, cut it into shape and glue it together. It should be resistant enough if done right. Pity my card is no longer air cooled, I would have jumped on this idea



I'm sticking with air cooled everything for life, because water scares the living crap out of me, especially around parts that are not easily replaced cause of the endless shortages...


----------



## turbogear (Aug 24, 2021)

Felix123BU said:


> Yup, safe trip and be careful, you can get only one Turbogear, but a Turbogear can get 10 Navi20 cards


Thanks, friend, for the well wishes. 
I arrived safely at my parents' place.
I am still 3 short of the 10 Navi2 cards I could have had in my hands for short times.  



lynx29 said:


> I'm sticking with air cooled everything for life, because water scares the living crap out of me, especially around parts that are not easily replaced cause of the endless shortages...


I don't use regular water but very low-conductivity fluids.
The low-conductivity fluids I use in my cooling loop are no problem.
I have spilled some on various components a few times, because sometimes I am a little clumsy when opening the loop for maintenance. 
I have not had any component damaged because of that so far.
One time the flow sensor's USB connection stopped working after I spilled some liquid on it, but it was not because of a short circuit; the liquid dried inside the pins and caused an open circuit / no connection. 
Cleaning the connector made it work again.


----------



## Valantar (Aug 24, 2021)

lynx29 said:


> I'm sticking with air cooled everything for life, because water scares the living crap out of me, especially around parts that are not easily replaced cause of the endless shortages...


I originally abandoned water cooling back in ... 2008-2009 after having a scare (tilting my case over without properly closing the fill port on my res, _just_ missing my GPU and just barely getting a few drops on the motherboard). I got over that fear back in 2017, and I'm not looking back - there's no way I would have been able to set up as small, quiet and powerful a build as I currently have without water cooling. I understand the anxiety, but with some care it's honestly quite fine. A bit nerve wracking at times, sure, but it's really not dangerous unless you make it so.

When assembling my current loop I was lazy in my fill routine and ended up spilling a decent amount down my CPU block/pump and onto the PSU, including onto several output connectors and seams in the PSU casing. Naturally, that wasn't particularly pleasant, but I just dried everything off thoroughly, including opening up the PSU to check if water had gotten inside (it hadn't), plus using my CompuCleaner to blow out any water stuck in hard-to-reach places (including the pin housing on one PSU connector). No damage on anything, and it has worked perfectly ever since. The moral of the story: be careful when filling, don't try to take shortcuts that make you spill stuff, but even if you do it'll be fine. Just take things apart and dry them before connecting them to power. (And obviously don't have anything connected to power when filling or bleeding the loop.) Worst case scenario you'll need some alcohol or electrical contact cleaner to remove residues. It's a bit of work, but breaking anything is really hard.


----------



## turbogear (Aug 25, 2021)

Valantar said:


> I originally abandoned water cooling back in ... 2008-2009 after having a scare (tilting my case over without properly closing the fill port on my res, _just_ missing my GPU and just barely getting a few drops on the motherboard). I got over that fear back in 2017, and I'm not looking back - there's no way I would have been able to set up as small, quiet and powerful a build as I currently have without water cooling. I understand the anxiety, but with some care it's honestly quite fine. A bit nerve wracking at times, sure, but it's really not dangerous unless you make it so.
> 
> When assembling my current loop I was lazy in my fill routine and ended up spilling a decent amount down my CPU block/pump and onto the PSU, including onto several output connectors and seams in the PSU casing. Naturally, that wasn't particularly pleasant, but I just dried everything off thoroughly, including opening up the PSU to check if water had gotten inside (it hadn't), plus using my CompuCleaner to blow out any water stuck in hard-to-reach places (including the pin housing on one PSU connector). No damage on anything, and it has worked perfectly ever since. The moral of the story: be careful when filling, don't try to take shortcuts that make you spill stuff, but even if you do it'll be fine. Just take things apart and dry them before connecting them to power. (And obviously don't have anything connected to power when filling or bleeding the loop.) Worst case scenario you'll need some alcohol or electrical contact cleaner to remove residues. It's a bit of work, but breaking anything is really hard.


Same here. 
I have two water-cooled PCs running at home and no damage since I started water cooling in 2013.
Filling is not problematic if the reservoir is located somewhere not directly above the PSU. 
I have mine located at the front of the case in both systems. If liquid spills during filling, it goes to the bottom of the case and onto the room floor, as my cases have ventilation grills at the bottom.
I do the filling with a large 150ml syringe, so I rarely spill liquid while filling.
I have sometimes spilled some liquid when taking out the GPU, but as mentioned before, usually a small amount. 
The only time I was unlucky was when it landed on the flow sensor and I had to clean the internal USB connector on top of the flow sensor, because it got soaked with liquid which dried inside it and broke the connection.
This CompuCleaner is really helpful. 
I also own one.


----------



## Butanding1987 (Aug 26, 2021)

I've finally broken the 23k mark (overall score). I think that's it for the 6800 XT.


----------



## Nargon33 (Aug 27, 2021)

jesdals said:


> If you're not going with water cooling for your cards, take a look at this


I'd be curious to know how well that works in a case that already has good ventilation from the front. A lot of prebuilts have disgustingly bad airflow, especially at the front. There are cases with 3 fans in the front blowing air against a glass panel with zero actual intake that just swirl case air around in circles and do nothing. Any case with no air intake is going to see huge temperature improvements from even a single 80mm fan that actually does something.


----------



## nolive721 (Aug 29, 2021)

Sold my AIO hybrid EVGA 1080 Ti a few days ago and bought a white 6700 XT Hellhound.

Sold the 1080 Ti for $700 and bought the PowerColor 6700 XT for $800.

It runs very quietly, and with a bit of undervolting, relatively cool as well.

The power consumption reduction vs the NVIDIA card is great, and performance is what the reviews and benchmarks were telling me, so I'm happy with my decision in the end for a $100 upgrade in these crazy GPU price days.

The only concern is the idle temps. I was hovering at around 35°C with the Ti (watercooled, of course), and I am above 45°C with the 6700 XT, but granted, that's in a warm room and with the zero-rpm feature activated as well.


----------



## Space Lynx (Aug 29, 2021)

nolive721 said:


> Sold my AIO hybrid EVGA 1080 Ti a few days ago and bought a white 6700 XT Hellhound.
> 
> Sold the 1080 Ti for $700 and bought the PowerColor 6700 XT for $800.
> 
> ...



Yeah, zero rpm is a great feature these days. I'm fine with the higher idle temps as long as they aren't too bad; 45 is great IMO. The less dust the better. Always remember, dust is our arch enemy... lol


----------



## Valantar (Aug 29, 2021)

nolive721 said:


> Sold my AIO hybrid EVGA 1080 Ti a few days ago and bought a white 6700 XT Hellhound.
> 
> Sold the 1080 Ti for $700 and bought the PowerColor 6700 XT for $800.
> 
> ...


That sounds like a pretty great upgrade, and for not a lot of money at all!

Regarding the idle temperatures: don't worry. Seriously. Your fans are off, yet the GPU sits at "above 45 degrees" - that's nothing at all. If it were going above 60 or something like that, I would expect the fans to kick on, which any GPU with a zero-rpm feature will do. Passively cooling a GPU in a warm PC case, even at idle, will result in some temperature increase (especially when your reference is water cooling). And it's perfectly fine in every conceivable way. Enjoy the silence - the only thing you'd achieve by disabling zero-rpm mode is wearing out your fans earlier than necessary.


----------



## nolive721 (Aug 29, 2021)

Yes, I really think a similar $100 upgrade would otherwise have taken much, much longer, like waiting for next-gen cards. And to be honest, I'm happy to be back in the RED camp already.

Thanks for the advice regarding idle temps, I will try not to worry about them as you said


----------



## Felix123BU (Aug 29, 2021)

lynx29 said:


> Yeah, zero rpm is a great feature these days. I'm fine with the higher idle temps as long as they aren't too bad; 45 is great IMO. The less dust the better. Always remember, dust is our arch enemy... lol


Dust may have something to do with my obsession with dust filters and why I only buy cases that have them everywhere    I hate dust buildup 



nolive721 said:


> Sold my AIO hybrid EVGA 1080 Ti a few days ago and bought a white 6700 XT Hellhound.
> 
> Sold the 1080 Ti for $700 and bought the PowerColor 6700 XT for $800.
> 
> ...


The white 6700 XT Hellhound - that card looks great, and 45-ish idle with no fans spinning is perfectly fine.


----------



## nolive721 (Aug 30, 2021)

Thanks

The card looks great indeed.
It's now horizontally mounted in a Q300L case that I use as my gaming rig.

I might modify the case in the future to allow vertical GPU mounting so the card can show off even more, as I like the fan LED implementation. 

It was one of the drivers for buying it.

A few years ago I had a PowerColor RX 480 Red Devil that I had heavily OC'd and BIOS-modded as well. 
It served me well for almost 2 years, so there's also trust in the AIB that I am buying into with this 6700 XT.

I was eyeing a higher-tier card like the 6800 XT, but prices were extremely high over here in Japan.

If I use my rig for 4K gaming in the relatively near future, then I am banking on the benefits of extra features I don't use today, like SAM or FidelityFX, to reach butter-smooth 4K 60Hz.

I might then go for a 4K FreeSync panel, which would eliminate the need for another GPU upgrade for a good 3 to 5 years, I hope.


----------



## Butanding1987 (Aug 30, 2021)

Time to overclock that 6700 XT. Let’s go.


----------



## nolive721 (Aug 30, 2021)

I know, I know, and I will, trust me.

But the priority is undervolting right now.


----------



## Butanding1987 (Aug 30, 2021)

Noooooooo.


----------



## nolive721 (Aug 30, 2021)

OK OK, just to avoid giving the wrong impression:

I had pushed the OC on my Vega 64 LC, as well as the Ti at some point, using the setting below. 

It was good for a 27,000-point graphics score in Fire Strike, but the Heaven benchmark was really low at around 100fps, if my memory is right.






I did a very first Heaven run by maxing out the OC in Radeon Software as shown below, and the card seemed stable; it gave me a score of around 150fps at 1080p with extreme tessellation and Ultra quality, without a CPU OC and in windowed mode.
From my testing with my GPUs, I could add a few more fps with an OC'd CPU and full screen, surprisingly or not.

will do some 3D mark testing when time permits




That was too tempting. FS with stock CPU and the OC on the 6700 XT as I mentioned above.

Will do some more tweaking later this week, but feel free to comment on the graphics score please.


----------



## Butanding1987 (Aug 30, 2021)

Try keeping a 200MHz difference between minimum and maximum clock frequencies. That should improve performance. Check if your VRAM OC is giving you higher scores and there's no regression. The goal for now is 40,000 graphics points.  Wow, I didn't know the 6700 XT had a max of 1.2V.


----------



## nolive721 (Aug 30, 2021)

Might be a bit of a challenge looking at my current 35k-point mark.
I guess I need to forget about my Vega and 10-series OC'ing habits when I play more seriously with the 6700 XT.

Thanks for the preliminary advice, but I might come back to you soon.

I apologize, as I see that I am posting a lot and being late to the 6000 series party; this might have been discussed already (it's a big thread, still reading, sorry).
Anyway, I did some further runs yesterday very late at night to test the best OC possible with my card.

These are the settings that allow me to pass just above 36,000 points, by creating the 200MHz gap between min and max core frequency and reducing the core voltage to around 1100mV.

My gut feeling is the card is not a great OCer (but that's fine, I don't need the extra power right now anyway), although it's not really a turd.

The benchmarks I have seen on Japanese websites (I live here) for any 6700 XT tell me that scores vary between 35 and 36k points, so I am right there at 35k stock and 36k OC'd.

In the 3DMark database for this SKU, I see only very few people at 37k points with similar temps to mine (low 60s °C).

For scores from 38k to 40k, card temps go below 40°C, so obviously those cards are watercooled, allowing for higher core frequencies.

As I said, I am not too bothered at this point in time; UVing to run my card cool & quiet and working well in my 2-display setup is the priority. But yes, I appreciate any feedback and further guidance, to see if I can do a very final tweak with this GPU.

Thanks a lot


----------



## nolive721 (Aug 31, 2021)

Pushed the core frequency to a max of 2800MHz, and that is going to be my final OC effort. I am content: Top 10 in the world with the 3500X/6700XT combo, and without a watercooled GPU.


----------



## Valantar (Aug 31, 2021)

nolive721 said:


> Pushed the core frequency to a max of 2800MHz, and that is going to be my final OC effort. I am content: Top 10 in the world with the 3500X/6700XT combo, and without a watercooled GPU.


Nice! How does it perform in Time Spy?


----------



## zerasse (Aug 31, 2021)

Hello!

I just upgraded from a 5700 XT Red Devil to the reference 6700 XT made by PowerColor. 
The card is limited to 215W by default, as you know, and I used MPT to push it to 240W, as I've read many have done.
I got about 1k more points in Time Spy and it was stable. Temps were about 72°C max on the core and 100°C max on the memory junction; the memory itself was at a reasonable temp, though I can't remember what it was.
The OC was min 2500MHz, max 2800MHz at 1.2V, with memory on fast timings and maxed out at 2150.

The question is: is it safe to run the reference card at 240W max in the long run if the temps are fine and the system is stable?
I've seen that the custom models have 2x 8-pin connectors, while my card has only 8+6. 

Thank you for your help in advance!


----------



## Felix123BU (Aug 31, 2021)

zerasse said:


> Hello!
> 
> I just upgraded from a 5700 XT Red Devil to the reference 6700 XT made by PowerColor.
> The card is limited to 215W by default, as you know, and I used MPT to push it to 240W, as I've read many have done.
> ...


8+6 is 150W+75W = 225W; with another 75W max from the PCIe slot, you get to a theoretical max of 300W.
There should generally be no serious issues with 240W in your case; no electronic device is built without some tolerances, as in it won't break if you pump a bit more through it, within reasonable limits. But like any OC, that's your decision. I would not worry about 240W; I would worry more about temps, generally speaking.
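To make that connector arithmetic explicit, here's a tiny sketch. The per-connector limits come from the PCIe spec; the helper function itself is just for illustration:

```python
# Per-connector power delivery limits from the PCIe spec (watts).
SLOT_W = 75    # x16 slot delivers up to 75W
PIN6_W = 75    # 6-pin PEG connector
PIN8_W = 150   # 8-pin PEG connector

def board_power_budget(aux_connectors):
    """Theoretical max board power for a card with the given aux connectors."""
    ratings = {"6pin": PIN6_W, "8pin": PIN8_W}
    return SLOT_W + sum(ratings[c] for c in aux_connectors)

# Reference 6700 XT: 8-pin + 6-pin
print(board_power_budget(["8pin", "6pin"]))  # 300 -> a 240W limit leaves 60W headroom
```

So even at 240W, the card stays well inside what its connectors are rated to deliver; the bigger long-term question is temps, as noted above.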


----------



## Butanding1987 (Sep 1, 2021)

nolive721 said:


> Pushed the core frequency to a max of 2800MHz, and that is going to be my final OC effort. I am content: Top 10 in the world with the 3500X/6700XT combo, and without a watercooled GPU.


Never say never.  



Felix123BU said:


> 8+6 is 150W+75W = 225W; with another 75W max from the PCIe slot, you get to a theoretical max of 300W.
> There should generally be no serious issues with 240W in your case; no electronic device is built without some tolerances, as in it won't break if you pump a bit more through it, within reasonable limits. But like any OC, that's your decision. I would not worry about 240W; I would worry more about temps, generally speaking.


Listen to @Felix123BU, who has forced his GPU to guzzle 450W during a Time Spy run.


----------



## nolive721 (Sep 1, 2021)

Is there a good tutorial you could recommend for running MPT on a 6700 XT card?
Thanks


----------



## Felix123BU (Sep 1, 2021)

Butanding1987 said:


> Never say never.
> 
> 
> Listen to @Felix123BU, who has forced his GPU to guzzle 450w during a Time Spy run.


Naaah, only 380W (270W being the default limit). I couldn't bear to go higher, even though I am reasonably sure the card can take it


----------



## eidairaman1 (Sep 1, 2021)

Well, here come the 6900 XTX talks. I think AMD should wait for the 3090 Ti launch, or just go all out with RDNA3 for an RX 7990 XT (I miss the Pro name for top-dog cards).


----------



## nolive721 (Sep 1, 2021)

Valantar said:


> Nice! How does it perform in Time Spy?


Pretty good, I think, looking at some benchmarks online and the 3DMark database. What do you think?


----------



## Valantar (Sep 1, 2021)

nolive721 said:


> Pretty good, I think, looking at some benchmarks online and the 3DMark database. What do you think?


I don't really have a frame of reference for 6700 XTs, but that looks good to me! Interesting to see how this stacks up against my undervolted 6900 XT (~180W IIRC) and at stock (~330W). At pretty much exactly half the compute resources, you can get _a lot_ more than half the performance just by pushing power higher. Of course, my UV results also make me curious just how low power a 6700 XT die could go. I lost ~15% performance for a ~45% power cut from stock, so if that transfers over to your card, you'd be looking at less than 120W(!) for a TS score of ... I don't know what your baseline was, but something like ~10 000 graphics score? That sounds like wishful thinking to me, but it would be really cool to see.
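The projection in the paragraph above is simple scaling. A quick sketch, where the 6700 XT stock numbers are assumptions for illustration rather than measurements:

```python
# Back-of-the-envelope projection of the 6900 XT undervolt trade-off,
# applied to a 6700 XT. The stock values below are assumed, not measured.
stock_power_w = 215           # reference 6700 XT power limit (per this thread)
stock_graphics_score = 11800  # hypothetical stock Time Spy graphics score

power_cut = 0.45  # ~45% power reduction seen on the undervolted 6900 XT
perf_loss = 0.15  # ~15% performance loss that came with it

uv_power = stock_power_w * (1 - power_cut)
uv_score = stock_graphics_score * (1 - perf_loss)
print(f"{uv_power:.0f}W, ~{uv_score:.0f} graphics points")  # 118W, ~10030
```

Whether the scaling actually transfers between dies is the open question; the smaller chip may simply sit at a different point on the voltage/frequency curve.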


----------



## jesdals (Sep 1, 2021)

Anyone else getting a hard restart at cold boot? I have had a couple of days with it, but no other problems after the new chipset driver on my AMD X570 setup. It seems to be AMD driver related; I'm currently running the 21.7.2 driver.


----------



## nolive721 (Sep 1, 2021)

Valantar said:


> I don't really have a frame of reference for 6700 XTs, but that looks good to me! Interesting to see how this stacks up against my undervolted 6900 XT (~180W IIRC) and at stock (~330W). At pretty much exactly half the compute resources, you can get _a lot_ more than half the performance just by pushing power higher. Of course, my UV results also make me curious just how low power a 6700 XT die could go. I lost ~15% performance for a ~45% power cut from stock, so if that transfers over to your card, you'd be looking at less than 120W(!) for a TS score of ... I don't know what your baseline was, but something like ~10 000 graphics score? That sounds like wishful thinking to me, but it would be really cool to see.


Thank you. I also think the card is not too bad, from references I see around the web. But I also see people with much higher scores than mine in TS or FS, and those people have much lower temps/higher clocks, so I assume they have watercooled their cards.

I was not aware of this MPT tool, which I assume is comparable to the PowerPlay table editor that was/still is used on the Vega generation, although I was not using it on my Vega 64 LC, as I was using Wattman and the Memory Tweaker tool to make the most of my GPU.
The day I go 4K I might visit this MPT topic in more detail, but any guidance already today is appreciated.

My current approach, though, related to my 2-gaming-display setup, is focusing on UV to limit power use while maintaining a >60fps, no-stuttering experience, and the 6700 XT is delivering, so I have no problem right now.

This is the very basic UV setting I am running now. I didn't really look at synthetic benchmarks, but monitoring power use in games, I am around the 100-120W max mark, so I really cannot complain.

I might do a bit more of a deep-dive analysis when time permits and share it if you are interested, maybe pushing the UV further if the card can go lower in the voltage/frequency combination.


----------



## zerasse (Sep 2, 2021)

Thank you for your help with the power limit for the reference 6700 XT.
I guess I got lucky in the silicon lottery, as the results in 3DMark are currently the following.
Will continue tweaking and trying to lower the voltage.

In case you're wondering, my setup is the following:

Asus Prime Z490-P
10700K @ 5GHz
3500MHz DDR4 CL16
6700 XT reference from PowerColor

Sorry for the amount of pictures


----------



## nolive721 (Sep 2, 2021)

Impressive score indeed


----------



## INSTG8R (Sep 2, 2021)

Clearing my “stock” score by 1000 points, and my max boost is 2622. Of course it's not hitting that in Time Spy, but I've seen it over 2600 gaming; it's usually running 2570-ish.


----------



## nolive721 (Sep 4, 2021)

INSTG8R said:


> Clearing my “stock” score by 1000 points  and my Max  Boost is 2622 of course it’s not hitting that in Timespy but I’ve seen it over 2600 gaming but it’s usually running 2570ish


So out of curiosity, what were your Time Spy scores at stock and after OCing?



zerasse said:


> Thank you for your help about the power limit for reference 6700xt.
> I guess i got lucky in the silicon lottery as the results atm are following in 3dmark?
> Will continue tweaking and trying to lower the voltage.
> 
> ...


When you say you worked on the power limit to reach this score, was that in the Radeon drivers or using this MorePowerTool (MPT) I have seen mentioned on this thread?

thanks!


----------



## Garlic (Sep 4, 2021)

nolive721 said:


> so out of curiosity, what were your Timespy stock and after OCing?
> 
> 
> when you say you worked on the power limit to reach this score, was that in the RADEON drivers or using this MorePowerTool(MPT) I have seen being mentioned on this thread?
> ...


MPT is a way to modify the power limit beyond the Radeon driver's 15%. It's basically needed if you are going to OC.

Does anyone know if the Sapphire Nitro+ RX 6000 series cards have a VRAM temp sensor? I can't find one in GPU-Z or any other software.


----------



## Butanding1987 (Sep 4, 2021)

Garlic said:


> MPT is a way to modify the power limit beyond the radeon driver's 15%. It's basically needed if you are going to do OC.
> 
> Does anyone know if the sapphire nitro+ rx6000 series have a vram temp sensor? I can't find one on gpuz or any other software.


HWiNFO displays it as "GPU Memory Junction Temperature".


----------



## nolive721 (Sep 4, 2021)

Garlic said:


> MPT is a way to modify the power limit beyond the radeon driver's 15%. It's basically needed if you are going to do OC.
> 
> Does anyone know if the sapphire nitro+ rx6000 series have a vram temp sensor? I can't find one on gpuz or any other software.


Yes, I am aware of what the MPT tool does, as I noticed this Navi gen is locked to a 15% power increase in the Radeon driver. I have been testing and using similar tools on my Polaris and Vega cards in the past. It's just that I haven't found a proper tutorial for running MPT on a 6700XT card.


----------



## Butanding1987 (Sep 4, 2021)

nolive721 said:


> yes I am across what the MPT tool would do as I noticed this NAVI gen is locked at 15% Power increase only in RADEON driver , I have been testing and using similar tools on my POLARIS and VEGA cards in the past. Its just that I haven't found a proper tutorial to run  this MPT on a 6700XT card


You basically just load your BIOS into MPT and tweak the power limit (GPU wattage). You don't need to touch the other settings. Then click "Write SPPT" to save your soft power play table in the registry, and restart. So if you put in 350 W, you'll get 350 W + 15% (about 53 W extra), for a total power of around 403 W.

Most of the MPT settings don't make sense to touch and don't result in substantial performance improvements. Some settings could mess up your GPU (for example, changing voltage numbers), while others could blow up your card (GFX amp).
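As a quick sanity check of the arithmetic above, the effective power target is just the MPT base wattage with the driver's power-limit slider percentage applied on top (plain Python, nothing MPT-specific):

```python
def effective_power_limit(mpt_base_w, slider_pct=15.0):
    """Total GPU power target: the base wattage written via MPT ("Write SPPT")
    plus the Radeon driver's power-limit slider applied on top of it."""
    return mpt_base_w * (1.0 + slider_pct / 100.0)

# 350 W base + 15% slider -> the ~403 W mentioned above
print(round(effective_power_limit(350), 1))  # 402.5
# A 186 W stock base bumped to 200 W -> with the slider maxed
print(round(effective_power_limit(200), 1))  # 230.0
```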


----------



## nolive721 (Sep 4, 2021)

OK, just this? 186 W is stock, so I can try bumping it to, say, 200 W






And nothing else, not even this?


----------



## Butanding1987 (Sep 4, 2021)

nolive721 said:


> ok just this like 186W is stock so I can try bumping it to say 200W
> 
> 
> ...


That's correct.


----------



## nolive721 (Sep 4, 2021)

It worked well.
Just did an FS run and my graphics score bumped to 37,300, where I had not seen better than 36k before.

I set the new power limit at 200 W for this first attempt, but will do some more testing tomorrow with a higher target to see how the card behaves.

Silly questions maybe, but...

Do you need to load MPT and apply the power increase every time you boot Windows? I am asking because I am doing another FS run right now with the same OC, and the score is back to 36k, with relatively low power draw during the run.

Another one: if I decide to remove this extra power limit for my card at some point, is it as simple as using "Delete SPPT" in MPT?

Thanks a lot, and sorry again if these questions have been asked a thousand times here.


----------



## Butanding1987 (Sep 5, 2021)

nolive721 said:


> It worked well
> Just did a FS run and my score bumped to 37300 Graphics where I had not seen better than 36k before
> 
> I set the new power limit at 200W for this first attempt but will do some more testing tomorrow with higher target to see how the card behaves
> ...


The settings are saved in the registry, so you don't need MPT anymore unless you want to change the power limit. When benching, it's a good idea to monitor your clocks and power using GPU-Z just to make sure you're hitting your target. A driver crash can lower your clocks, but a shutdown will help you recover your desired clocks (Radeon Software also resets everything to default when it detects a Windows or driver crash). About going back to default: yes, it's as easy as "Delete SPPT". Don't forget to restart.


----------



## zerasse (Sep 5, 2021)

I think I've now pushed my 6700XT reference card to its limits; this is my highest score in Time Spy.
I'm really satisfied with the card. I ordered thermal pads for the VRMs and VRAM; let's see if I can then get the number 1 spot.


----------



## Butanding1987 (Sep 6, 2021)

zerasse said:


> I think i've now pushed my 6700xt reference card to it's limits  this is my highest score in Timespy.
> I'm really satisfied with the card. I ordered thermalpads for the VRMs and VRAM and lets see if i can then get the number 1 spot .


Solid score.


----------



## INSTG8R (Sep 6, 2021)

nolive721 said:


> so out of curiosity, what were your Timespy stock and after OCing?


Haven’t even tried, happy with my stock performance so far.


----------



## nolive721 (Sep 6, 2021)

What are your scores at stock then? Can you share?


----------



## INSTG8R (Sep 6, 2021)

nolive721 said:


> What are you scores stock then? Can you share?


I'm usually on beta drivers, but here is one on public drivers I can share:








I scored 12 390 in Time Spy
AMD Ryzen 5 5600X, AMD Radeon RX 6700 XT x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## droopyRO (Sep 8, 2021)

MSI RX 6600 XT Gaming X Review | Total Warhammer 2 | GPU & Displays
www.overclock3d.net




Does anyone here play Total War Warhammer 2? My "old" GTX 1070 Ti is as good as my RX 6600 XT in the in-game benchmarks. I tested it on an i5 8600K @ 5 GHz and on a stock 5600X. But in that review (the only one I found with TWW2 and a 6600 XT), the 6600XT scores way higher than the GTX 1080: 78 vs 53 fps on average at 1440p.
Has anyone here played Warhammer 2 with a 6600 XT? How does it perform? Thanks.


----------



## Felix123BU (Sep 8, 2021)

droopyRO said:


> MSI RX 6600 XT Gaming X Review | Total Warhammer 2 | GPU & Displays
> 
> 
> Total Warhammer 2
> ...


Played it some time ago; that game is super CPU-dependent, not so much GPU-bound as CPU-bound. That bench was done with a Ryzen 3950X, and this game loves cores more than any other I have played.


----------



## droopyRO (Sep 8, 2021)

@Felix123BU

I love the Warhammer series; I have over 2000 hours in it. But in my experience it loves frequency/IPC, not the number of cores. I had less performance when I went from an 8600K to a 2700X. The problem here is that, if those graphs are true, I should have about 50% more performance in this benchmark, and I get maybe 10-20% more. See this video I made:


----------



## Felix123BU (Sep 8, 2021)

droopyRO said:


> @Felix123BU
> 
> I love Warhammer series, i have over 2000 hours in it. But in my experience is that it loves frequency/IPC not the number of cores. I had less performance when i went from a 8600K to a 2700X. The problem here is that, if those graphs are true, i should have about 50% more performance in this benchmark. And i get maybe like 10-20% more performance, see this video i made:


One thing I noticed while looking at that bench link you posted was the weirdly small difference at 1080p between all GPUs, which in my opinion is either a CPU bottleneck or an artificial limit imposed by the engine.
Only 20 FPS between a 6600XT and a 6900XT, and the 6600XT being faster than a 3080 Ti
The 2700X was not as good at gaming as the 8600K, but the 3950X is another story, and in a game that utilizes cores really well, that could be the difference you are looking for. It could also be RAM speed and latency; tuned and faster RAM can make a big difference in certain scenarios, especially CPU-bound games. I played a lot with Shadow of the Tomb Raider and got around +25 FPS just by tuning RAM.


----------



## droopyRO (Sep 9, 2021)

No way a 3950X accounts for 50% more performance in any game compared to a 5600X. As for RAM, "big" as in 1-3%? I noticed no difference between 3200 and 3600, or CL16 and CL14, in any real-life scenario in the year since I moved to AMD, first with a 2700X and now a 5600X. The biggest upgrade for gaming was going from the 2700X to the 5600X. But it was a wrong move; I delidded my 8600K a few months ago, and at 5.1 GHz it is almost as good as the 5600X stock in games. I guess I should not trust reviews that much

Still, if anyone here has a 6600 XT and/or a GTX 1080 Ti, please tell me what your in-game benchmark results are for TW Warhammer 2. Thanks.


----------



## Felix123BU (Sep 9, 2021)

droopyRO said:


> No way a 3950X accounts for 50% more performance in any game compared to a 5600X. As for RAM, "big" as in 1-3% ?  i noticed no difference between 3200 and 3600 or CL 16 and 14 in any real life scenario in a year since i moved to AMD. First a 2700X and now a 5600X. The biggest upgrade for gaming was going from the 2700X to 5600X. But it was a wrong move, i delided my 8600K a few months ago, and at 5.1Ghz is almost as good as the 5600X stock in games. I guess i have to not trust reviews that much
> 
> Still, if anyone here has a 6600 XT and or a GTX 1080 Ti, please tell me what your ingame benchmark's are for TW Warhammer 2. Thanks.


"No way a 3950X accounts for 50% more performance in any game compared to a 5600X" mmmm, it potentially could  I am not saying its 50% more, but I was curious about this whole point, so went on and installed TWW2, used the same presets as that OC3D test you posted first, and used my 6800XT as comparison, its not the 6600XT you have, but it says a lot about system differences, see below:

I have a 5800X @ 4.7 GHz all-core, with 3800 MHz CL16 tuned RAM.

1080p Ultra preset: I get 174.8 FPS average, while the benchmark gets 110 FPS for the 6800XT, which is roughly 59% more for the exact same graphics card





1440p Ultra preset: I get 118.9 FPS average, while the benchmark gets 105 FPS for the 6800XT, which is about 13% more





Now, as said, I know it's not a 6600XT, but for me it's pretty clear that CPU and RAM have a huge influence on FPS in this game, and to bring out the best of a GPU in this game you first and foremost need a very strong CPU-RAM combo. Hope that gives you an idea about what's going on with your 6600XT.
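For anyone redoing the percentages in this comparison, it's a simple relative difference; the figures below are just the FPS numbers from this post recomputed:

```python
def pct_faster(fps_a, fps_b):
    """How much faster result A is than result B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

# 1080p Ultra: 174.8 FPS vs the review's 110 FPS -> roughly 59% faster
print(round(pct_faster(174.8, 110)))  # 59
# 1440p Ultra: 118.9 FPS vs 105 FPS -> roughly 13% faster
print(round(pct_faster(118.9, 105)))  # 13
```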


----------



## droopyRO (Sep 9, 2021)

Thank you for taking the time to install it.
What benchmark is that, the Battle one? I will try a memory and CPU overclock, but as you can see in the video I made, the GPU hits 99% most of the time; only in the Skaven benchmark do you see a CPU limitation. IIRC my RAM can do 3600 when OC'd, but I never tried to OC my CPU.

I found a video I made when I had an RTX 2080 (sad that I had to RMA that card) and quickly made a comparison to my 6600XT. In the Battle bench I get 58 fps on average, which means your 6800XT is pushing out more than double what a 6600XT can. That is huge.


----------



## QuietBob (Sep 9, 2021)

droopyRO said:


> All the test are done with an undervolt to 1.09V and the fans set to a maximum of 65%.


In your TWW2 video the 6600 XT reaches 84°C, which seems unusually high for a Navi 23; most review samples were under 65°C.
Is it showing edge temperature when undervolted?


----------



## droopyRO (Sep 9, 2021)

Are you referring to the above video? No undervolt or overclock, just the fans set to 60%. The higher number is the hot spot; 110°C is the upper limit from what I gather. Mine rarely goes past 90-92°C while gaming. These days the ambient temperature is lower, so it hovers around 88-90°C. But if you want lower hot spot temperatures, stay away from certain models like the Fighter, the Pulse, or the Biostar one


----------



## QuietBob (Sep 9, 2021)

droopyRO said:


> You reffering to the above video ? No undervolt, or overclock. Just set the fans to 60%. The higher number is the hot spot. 110ºC is the higher limit from what i gather. Mine rarely goes past 90-92ºC while gaming. This days the ambient temperature is lower so it hovers about 88-90ºC. But if you want lower hot spot temperatures, stay away from certain models like Fighter, Pulse or the Biostar one


I actually watched this one, but only now did I see the other temp reading. So 69°C for the edge sensor and 84°C (or even 92°C max) for the hot spot would be more in line with the reviews.
Thanks for the recommendations; I already got my 6600XT. I'd share my results, but I don't have any Total War games in my library...


----------



## Felix123BU (Sep 10, 2021)

droopyRO said:


> Thank you for taking the time to install it.
> What benchmark is that the Battle one ? I will try a memory and CPU overclock, but as you can see in the video i made. The GPU hits 99% most of the time, only in the Skaven benchmark that is where you see a CPU limitation. IIRC my RAM can do 3600 when OC, but i never tried to OC my CPU.
> 
> I found a video i made when i had a RTX 2080(sad that i had to RMA that card) and quickly made a comparison to my 6600XT. In the Battle bench, i have 58 fps on average, then that means that your 6800XT is pushing out more than double what a 6600XT can. That is huge.


Battle bench, base game without any DLCs. A CPU overclock probably won't have a huge impact, but RAM OC and subtiming tightening might. Also, at what resolution do you play? 1080p should see the biggest gains from RAM tuning, less so at higher resolutions.


----------



## droopyRO (Sep 10, 2021)

1440p; I just upload to YT at 1080p for speed. I will try to OC my RAM and see if it makes a difference. I was contemplating downgrading to a 1080p monitor, but I still prefer a big 32" at 1440p with a slower framerate to a smaller 24-27" 1080p monitor with 60+ fps. Thanks again.


----------



## Butanding1987 (Sep 12, 2021)

Are you running the game uncapped? Also, check your GPU power usage in GPU-Z. Your 6600 XT's power usage is abnormally low (80+ W compared with 200+ W on the 2080). You might have inadvertently lowered the power limit? I can't think of anything else.


----------



## Palindrome (Sep 12, 2021)

Do excuse the poor image quality; my phone really doesn't have the greatest of cameras. Either way, here's my Asus RX 6600 XT Dual OC. I got it for 390€ before taxes and shipping, which is a deal I'm pretty happy with considering the current market situation


Spoiler


----------



## droopyRO (Sep 12, 2021)

Butanding1987 said:


> Are you running the game uncapped? Also, check your GPU power usage in GPUZ. Your 6600 XT's power usage is abnormally low (80+w compared with 200+w on the 2080)  You might have inadvertently lowered the power limit? Can't think of anything else.


Why would I cap my framerate or power target and then complain? Fresh driver install after DDU from Safe Mode, no performance tuning. I posted on the official game forum too; maybe someone there can tell if it is a game or driver issue. Thanks.


----------



## Iarwa1N (Sep 18, 2021)

Hi guys, my 6800XT Red Devil won't POST at Gen 4 x16, only Gen 3 x16, on my MSI B550 Unify-X board. I tried different BIOSes on the motherboard, but that didn't solve the issue. I want to try another 6800XT BIOS on my GPU, but I don't know which BIOSes are compatible with my Red Devil.


----------



## Psychoholic (Sep 18, 2021)

I guess I can join the club... my main desktop is a 3080, but I just got an all-AMD laptop (5900X/6800M)


----------



## Butanding1987 (Sep 18, 2021)

Iarwa1N said:


> Hi guys, my 6800xt red devil won’t post at gen 4 x16, only gen 3 x16 on my msi b550 unify x board. I tried differentt bioses on the motherboard but that didn’t solve the issue. I want to try another 6800xt bios on my gpu but I don’t know which bioses are compatible with my red devil.


Make sure you're using PCI_E1, the slot closest to the CPU. The other slot that connects to the chipset is limited to PCIe Gen3. Nice board, by the way.


----------



## Iarwa1N (Sep 18, 2021)

Butanding1987 said:


> Make sure you're using PCI_E1, the slot closest to the CPU. The other slot that connects to the chipset is limited to PCIe Gen3. Nice board, by the way.


Thanks for the suggestion, but I already use the PCI_E1 slot, and I don't have any M.2 SSD either; only 1 SATA SSD and 1 SATA HDD are connected. I just tried disabling all integrated devices (onboard audio, Bluetooth, WiFi, network adapter), disabled C-states, and also disconnected all my case header connectors and all USB devices, but it still booted at PCIe Gen 3 x16. I love the board, and that's why I am willing to try another BIOS on my GPU; maybe a BIOS with a newer date than my Red Devil's will solve this issue.


----------



## Valantar (Sep 18, 2021)

Iarwa1N said:


> Hi guys, my 6800xt red devil won’t post at gen 4 x16, only gen 3 x16 on my msi b550 unify x board. I tried differentt bioses on the motherboard but that didn’t solve the issue. I want to try another 6800xt bios on my gpu but I don’t know which bioses are compatible with my red devil.


A simple sanity check: are you using any riser cable or vertical GPU mount of any kind?


----------



## Iarwa1N (Sep 18, 2021)

Valantar said:


> A simple sanity check: are you using any riser cable/vertical gpu mount of any kind?


No, the GPU is connected directly to the motherboard. The CPU is a 5900X, and I have a 2x16 GB 4000/C16 kit. I tried a CMOS reset and BIOS defaults without any kind of PBO or FCLK overclock, but same result.


----------



## nolive721 (Sep 18, 2021)

Did you try the force PCIe Gen 4 option that should be somewhere in your BIOS?


----------



## Iarwa1N (Sep 18, 2021)

nolive721 said:


> Did you try the force pcie gen 4 option that should be somewhere in your BIOS?


Yes, that didn't work either.


----------



## Valantar (Sep 18, 2021)

I sincerely doubt a GPU BIOS update will change anything. Most likely there's a signal integrity issue with either your GPU or motherboard. Without another 4.0 GPU to test with it's very difficult to know for sure, but that would be my guess.


----------



## Felix123BU (Sep 18, 2021)

Iarwa1N said:


> Hi guys, my 6800xt red devil won’t post at gen 4 x16, only gen 3 x16 on my msi b550 unify x board. I tried differentt bioses on the motherboard but that didn’t solve the issue. I want to try another 6800xt bios on my gpu but I don’t know which bioses are compatible with my red devil.


Firstly, I would not try writing another GPU BIOS right now; that might end in pain, except if it's released by the GPU manufacturer specifically for that model.
Secondly, I would suspect the motherboard BIOS, but as others have said, the best way to check what is going on is either to try another PCIe 4-capable card in your board, or to try your 6800XT in another board that supports PCIe 4.
This is the first time I've heard of something like this. I certainly hope you were not unlucky with a bad 6800XT; those things are so elusive it would be a shame


----------



## Iarwa1N (Sep 19, 2021)

Felix123BU said:


> Firstly, I would not try writing another GPU bios currently, that might end in pain, well, except if its released by the GPU manufacturer specific for that model.
> Secondly, I would suspect the motherboard bios, but as others have said, the best way to check what is going on is either try another PCIe 4 capable card on your MB, or try your 6800XT on another board that supports PCIe 4.
> This is the first time I hear something like this, I certainly hope you where not unlucky with a bad 6800XT, those things are so illusive it would be a shame


I thought by now it would be common to flash BIOSes on the 6000 series, and I would be safe since the Red Devil has a BIOS switch: if something went wrong I could always switch to the other BIOS and save the card. I live in Turkey and I bought the motherboard and GPU when I was in the US, so the RMA process will be hard for me, and nobody I know here has switched to a Gen 4-capable motherboard yet. That's why I thought, if it works, BIOS flashing would be the easiest solution for me.


Valantar said:


> I sincerely doubt a GPU BIOS update will change anything. Most likely there's a signal integrity issue with either your GPU or motherboard. Without another 4.0 GPU to test with it's very difficult to know for sure, but that would be my guess.


Actually, I replaced the factory paste on the GPU; maybe I touched the PCIe connector and left some paste on it. I will remove the GPU and clean its PCIe connector when I get home.


----------



## Valantar (Sep 19, 2021)

Iarwa1N said:


> I thought by now it would be common to flash bioses to 6000 series and I would be safe since Red Devil has a bios switch, if something goes wrong I could always switch to other bios and save the card. I live in Turkey and I bought the motherboard and gpu when I was in US so rma process will be hard for me to do and nobody I know here has switched to gen 4 capable motherboard yet. Thats why I thought if it works, the bios flashing would be the easiest solution for me.
> 
> Actually I replaced the factory paste of the gpu, maybe I touched the PCIE connection part and leave some paste. I will remove and clean the PCIE connector of the gpu when I get home.


You repasted a brand new GPU? Why? It seems like you have some bad habits you need to unlearn:
- unless your thermals are bad and there is an indication of a problem, repasting is unnecessary for the first few years of a GPU's life. The couple of degrees difference better paste might make is irrelevant to performance.
- a BIOS update is NOT a likely fix for whatever random problem one might have with a GPU. It _might_ be, but that requires someone to actually identify that there is a bug in that BIOS. If not, you're just risking breaking something for no good reason.

Your reasoning here is "I have a weird problem -> a BIOS update will probably fix it", which is a huge leap. A GPU BIOS update is a last resort, not an early troubleshooting step. Chances are equally large that you'll brick the card; just look at all the threads on these forums asking for help after doing exactly that.


----------



## Iarwa1N (Sep 19, 2021)

Valantar said:


> You repasted a brand new GPU? Why? It seems like you have some bad habits you need to unlearn:
> - unless your thermals are bad and there is indication of a problem, repasting is unnecessary for the first few years of a GPU's life. The couple of degrees difference better paste might make is Irrelevant to performance.
> - a BIOS update is NOT a likely fix for whatever random problem one might have with a GPU. It _might_ be, but that requires someone to actually identify that there is a bug in that BIOS. If not, you're just risking breaking something for no good reason.
> 
> Your reasoning here is "I have a weird problem - > a BIOS update will probably fix it", which is a huge leap. A GPU BIOS update is a last resort, not an early troubleshooting fix. Chances are equally large that you'll brick the card - just look at all the threads on these forums asking for help due to doing just that.


I repasted it because I had a huge difference between GPU temperature and junction temperature: the max GPU temperature was around 75°C while the junction was around 100-113°C, so the card was hitting max junction temperature even with that huge cooler. After repasting with Noctua NT-H1, my new temps were GPU 80°C, junction 90-100°C.

So there's still no way to cross-flash these cards without bricking them? I would like to try a compatible MSI 6800XT BIOS to see what would happen if the board and GPU had matching MSI-branded BIOSes.


----------



## Iarwa1N (Sep 21, 2021)

An update to my situation: I got a chance to try my 6800XT on another motherboard, an Asus X570, and even without installing drivers the GPU worked at Gen 4 x16 without a problem. I think my motherboard is faulty.


----------



## zerasse (Sep 21, 2021)

Hello again!

I read on the Guru3D forums that the new 21.9.1 Radeon Software driver also has CPU temperature metrics for the overlay.
I suppose this option is only available for systems with a Ryzen CPU, because I can't see the metrics in my Radeon overlay?
I tried this on Win 11 and Win 10.

My other question is about Windows Hardware-Accelerated GPU Scheduling (HAGS). On Win 10, GPU-Z says it's not on and not supported.
On the latest Dev build of Win 11, GPU-Z tells me it's not on but supported, although I can't see an option for it in the Windows graphics settings where it should be turned on.
Is there someone here who can use HAGS and/or advise me whether it's possible to turn it on somehow?

Thank you in advance!


----------



## turbogear (Sep 24, 2021)

Iarwa1N said:


> An update to my situation; I got a chance to try my 6800xt on another motherboard which is a Asus x570 and even without installing drivers the gpu worked at gen 4 x16 without a problem. I think my motherboard is faulty.


Good that you were able to find the root cause.  
Like @Felix123BU, I was also thinking that this did not sound like a 6800XT problem.

Either the motherboard or the PCIe connector could be the issue.
Sometimes it is just some weird BIOS issue on the mainboard that goes away if you clear the CMOS.
I had a weird issue myself last weekend when I put together a new computer for my son.
On the ASUS Crosshair VIII Hero mainboard, the CPU fan connector was not detecting the DDC pump, and the pump would run at full speed.
I thought it was a problem with the DDC pump's RPM control cable, as it was not being detected on any fan port on the mainboard. I took apart the cable plug and the pump to check whether there was an issue with the cable or the connection on the DDC PCB.
This did not help.
I thought the pump's RPM circuit was broken, and as a last resort I decided to clear the mainboard BIOS in case the issue lay there.
To my surprise it was some weird BIOS issue: after I cleared the CMOS, the DDC pump was detected and I could control the speed with AI Suite.



zerasse said:


> Hello again!
> 
> I read on Guru3D forums that the new driver 21.9.1 Radeon Software has metrics for CPU temps also for overlay.
> I suppose this option is only available for systems that has Ryzen CPU because i can't see the metrics on my Radeon overlay?
> ...


I have not spent much time on my computer in the last few days since I came back from my trip to India.
I need to check if I can see the CPU metrics for my 5900X with the new driver.
What I did notice when I put together the all-new AMD machine (5800X and 6800XT) for my son is that on the Wattman page in 21.9.1 there was a new field to overclock the Ryzen processor through Wattman.
It used to have only GPU OCing there.


----------



## Felix123BU (Sep 24, 2021)

turbogear said:


> Good that you were able to find the root cause.
> Like @Felix123BU, I was also thinking that this does not sound like 6800XT problem.
> 
> Either the motherboard or the PCI-e connector could be an issue.
> ...


Indeed, the BIOS these days tends to be the root cause of a lot of weird issues, including some software not running or running weirdly. BIOSes have become more complex, and with extra complexity comes extra potential for issues, so it is one of the first things I check if things do not behave as they should. Back in the day, one could run the initial BIOS for years; nowadays BIOS updates are in many cases mandatory.


----------



## turbogear (Sep 24, 2021)

Felix123BU said:


> Indeed, the Bios of today tends to be the root problem for a lot of weird issues, including some software not running or running weirdly, they have become more complex and with extra complexity comes extra potential issues. One of the first things I check if things do not behave as they should. Back in the day, one could run with the initial bios for years, nowadays bios updates are in many cases mandatory.


Yes, especially with Ryzen machines the BIOS is in constant development by AMD, much like the Radeon Software updates.
They are still releasing AGESA updates.
I have the latest BIOS on both machines.


----------



## Felix123BU (Sep 24, 2021)

Any of you 6000 series owners planning on upgrading to Alder Lake CPUs when they come out? I would be very curious how the 6000 series behaves with the upcoming CPUs from Intel.


----------



## turbogear (Sep 24, 2021)

Felix123BU said:


> Any of you 6000 series owners plan on upgrading to Alder Lake CPU's when they come out? I would be very curios how the 6000 series behaves with the upcoming CPUs from Intel


No Alder Lake for me.
If anything, I will upgrade my 5900X to the Zen 3 3D V-Cache processor that AMD plans to release this year or at the start of next year.









AMD presents more details on Zen 3 3D V-Cache and the future of 3D stacking
Apparently, AMD was considering implementing Intel's Foveros 3D technology at some point, but later decided to go with TSMC's superior Micro Bump 3D packaging that is 1 micron thinner and quite a bit more efficient.
www.notebookcheck.net


----------



## droopyRO (Sep 24, 2021)

Is it me, or is the image quality out of the box better on AMD than on Nvidia, both in games and in a desktop environment? The last time I had an AMD card was in 2014. Since then I have only had Nvidia, and after a month or so of using the 6600 XT I switched back to the 1070 Ti to do some tests. The colors are less vibrant and the image is blurrier, on the same monitor. Is it placebo, or...?

And a second thing: I played around with integer scaling. On a 1440p monitor I have to drop down to 720p to make it work, but the image quality is poor. Then I tried 1080p + integer scaling on a 4K TV. It looks really good, and I am contemplating getting a 4K TV myself to use as a monitor. The only problem is the 60 Hz refresh rate, but the bigger size, image sharpness, and higher framerate than native 1440p make it an interesting option for RTS and RPG type games, maybe even for sims like DCS World or Dirt 2.0.
Do you guys have any experience with integer scaling and 4K resolution?


----------



## Felix123BU (Sep 24, 2021)

droopyRO said:


> Is it me, or dose the image quality out of the box is better on AMD than on Nvidia ? both in games and in a desktop envoirement. Last time i had an AMD card was in 2014. Since then i only had Nvidia, and after a month or so of using the 6600 XT i switched back to the 1070 Ti to do some tests. And the colors and sharpness are less vibrant and blurrier. I used the same monitor. Is it placebo or ?
> 
> And the second thing. I played around with integer scaling. On a 1440p monitor i have to drop down to 720p to make it work but the image quality is poor. And then i tried 1080p + integer on a 4K TV. It looks really good, i am contemplating getting a 4K TV myself and use it as monitor. The only problem is the 60Hz refresh rate. But the bigger size, image sharpness and higher framerate than 1440p make it an interesting option for RTS and RPG type games, maybe even for sims like DCS World or Dirt 2.0
> Do you guys have any experience with integer scaling and 4K resolution ?


First, I also had that experience recently: I borrowed a friend's 2070S to see with my own eyes a comparison between DLSS and FSR, and the first thing I noticed was the slightly washed-out colors on the Nvidia card. It was nothing major, but I just felt it was more pale; it could have been my imagination. There is an age-old discussion about the color compression being different, but I am not really sure the difference is clearly quantifiable. And in the end you can adjust the colors anyway, so I doubt there are any real big differences.

Regarding integer scaling, I can say only one thing: having tried both, FSR is hugely better at upscaling; integer scaling doesn't even come close in quality. There are apps that can apply FSR to any windowed game, and that is a much better option quality-wise from my point of view.


----------



## droopyRO (Sep 24, 2021)

I forgot about those kinds of programs. Someone told me about one, but I found that app dubious since it was in Chinese. And there was another one on Steam for $5. But at the same time, if they also work on Nvidia, then I should sell this 6600 XT and stick with my old 1070 Ti, which would get a nice boost from FSR. I will make a note and test it out when I'm off work.


----------



## Palindrome (Oct 7, 2021)

I believe I mentioned it in another thread at some point, but I feel like my Heaven score is somewhat too low.


Any 6600 XT owners able to do a run at 1080p max settings so I have something to compare to? My system specs are on my profile. I'm aware that my platform being PCIe 3.0 and the card being 4.0 and only x8 will lower performance somewhat, but I do feel like the card should be scoring higher. Don't get me wrong, I doubled my score coming from an RX 580, so I'm very content regardless. Just... paranoia that something is wrong, I suppose.


----------



## droopyRO (Oct 7, 2021)

Shoot me a PM later so I don't forget to run it when I get home.


----------



## Palindrome (Oct 8, 2021)

droopyRO said:


> Shoot me a PM later so I don't forget to run it when I get home.


I didn't check this thread till just now, my bad. There's no rush though; as long as you can get a benchmark run in with a 6600 XT, I'll be thankful.


----------



## droopyRO (Oct 9, 2021)

Palindrome said:


> benchmark run in with a 6600XT


Heaven Benchmark: i5-8600K @ 5100MHz, Z370 mobo, 24GB DDR4-3000 RAM, PowerColor RX 6600 XT Fighter with a slight undervolt.


----------



## Palindrome (Oct 14, 2021)

droopyRO said:


> Heaven Benchmark: i5-8600K @ 5100MHz, Z370 mobo, 24GB DDR4-3000 RAM, PowerColor RX 6600 XT Fighter with a slight undervolt.
> View attachment 220105


Thanks, that calms my nerves then


----------



## kiddagoat (Oct 22, 2021)

After having my card for about 6 months now, fixing the PowerColor mounting issue... and the RGB header issue... I am very content with my card after some tweaking.

I was still seeing a 30C delta between my edge temp and my hotspot temp. I took a look at some of the forums around here and looked into undervolting.

Currently I am sitting at a -10% power limit and 1100mV while keeping a max clock of around 2500MHz. My hotspot never goes over 80C now; before, running 3DMark and other stress tests, it would get up to 95C.

I don't think a thermal paste swap would help much. I tossed a tube of MX-2 I had that was over 10 years old and used the EK Ecotherm this time around on my CPU and GPU. I noticed that the EK paste is a bit thinner in viscosity compared to the MX-2 or even the IC Diamond paste I have used in the past.


----------



## Felix123BU (Oct 22, 2021)

kiddagoat said:


> After having my card for about 6 months now, fixing the PowerColor mounting issue... and the RGB header issue... I am very content with my card after some tweaking.
> 
> I was still seeing a 30C delta between my edge temp and my hotspot temp. I took a look at some of the forums around here and looked into undervolting.
> 
> ...


Probably the cold plate is not perfectly even; in those cases a thermal pad can be better than paste, but only in limited scenarios. Still, clocks of 2.5GHz with only 80C on the hotspot is excellent for an air cooler.


----------



## Dristun (Oct 31, 2021)

Ladies, fellas, good day, sorry for barging in. Is there any actual difference in cooling between Sapphire's Nitro+ 6600 XT and Pulse 6600 XT? Any owners, or people lucky enough to compare, here maybe? Judging by the photos on their website the heatsink looks exactly the same; it's just that one of the fans is for some reason reversed on the Nitro+. These two are somehow the cheapest 6600 XTs available here now, and I might just buy one, because the 1660 doesn't cut it for smooth 1440p in 2042 and Vanguard anymore. Thanks in advance.


----------



## Valantar (Oct 31, 2021)

Dristun said:


> Ladies, fellas, good day, sorry for barging in. Is there any actual difference in cooling between Sapphire's Nitro+ 6600 XT and Pulse 6600 XT? Any owners, or people lucky enough to compare, here maybe? Judging by the photos on their website the heatsink looks exactly the same; it's just that one of the fans is for some reason reversed on the Nitro+. These two are somehow the cheapest 6600 XTs available here now, and I might just buy one, because the 1660 doesn't cut it for smooth 1440p in 2042 and Vanguard anymore. Thanks in advance.


They do indeed look essentially identical, and I wouldn't expect two coolers that similar to perform noticeably differently. If I were to guess, I'd assume the Nitro has a better binned GPU and possibly more VRM phases populated (i.e. the Pulse might leave one or two out). Sapphire tends to make excellent coolers and GPUs, so I would guess you're getting a good card either way. Just go with whichever suits you best, whether that's down to price, looks, or something else.


----------



## HD64G (Oct 31, 2021)

Valantar said:


> They do indeed look essentially identical, and I wouldn't expect two coolers that similar to perform noticeably differently. If I were to guess, I'd assume the Nitro has a better binned GPU and possibly more VRM phases populated (i.e. the Pulse might leave one or two out). Sapphire tends to make excellent coolers and GPUs, so I would guess you're getting a good card either way. Just go with whichever suits you best, whether that's down to price, looks, or something else.


Agreed! The differences for such a low-power GPU aren't enough to spend even $30 more on the Nitro vs the Pulse.


----------



## lunatik (Nov 5, 2021)

Hi everyone, I happened to find this forum looking for answers about MPT (MorePowerTool).
Does anyone know what can be tweaked on the 6700 XT? I have not touched MPT yet.
Can you only change the max power, or other things too? I downloaded 3DMark Time Spy, and with my 5600X/6700 XT I got 1st place, at least among single-GPU results (13459).


----------



## Butanding1987 (Nov 6, 2021)

lunatik said:


> Hi everyone, I happened to find this forum looking for answers about MPT (MorePowerTool).
> Does anyone know what can be tweaked on the 6700 XT? I have not touched MPT yet.
> Can you only change the max power, or other things too? I downloaded 3DMark Time Spy, and with my 5600X/6700 XT I got 1st place, at least among single-GPU results (13459).


Basically you just increase the GPU power limit (watts). I suggest you leave GFX TDC limit (amp) and SOC TDC limit (amp) alone.


----------



## lunatik (Nov 6, 2021)

Butanding1987 said:


> Basically you just increase the GPU power limit (watts). I suggest you leave GFX TDC limit (amp) and SOC TDC limit (amp) alone.


It seems the power limit is not the problem for me, since I could not improve my score. I went from 211W to 235W in MPT, plus the +15% slider (270W total). Most likely I'll still try changing something else when I get the time to learn more about it.


----------



## mechtech (Nov 6, 2021)

Was hoping to get an RX 6600 this Christmas, but it's not looking good lol. Oh well.


----------



## Venom_Chris (Nov 11, 2021)

Hello all, new here, just posting the information I have to help someone who might google my setup in future.

Short background: my last rig consisted of an Asus Ares II GPU and an i7-3960X. That's how long I've been away from the tech game (about 6 years).

Anyway, I decided to get back into it with a laptop: an Alienware Area 51m (desktop i9-9900K, RTX 2080). It runs HOT with both the GPU and CPU; I'm talking the 9900K hitting 100C multiple times daily, sometimes 101C.

Decided to try an eGPU with an Asus ROG Strix LC OC 6800 XT.

The eGPU is an Alienware Graphics Amplifier running at PCIe 3.0 x4.

The card works flawlessly and is a beast even in an eGPU at PCIe 3.0 x4.

My Fire Strike graphics score is hitting about 56k+ with only a 2.5GHz overclock, nothing on memory.

Annoyingly though, my card can do max clocks, just not in Fire Strike. It will sit at 2755MHz all day in games and stress tests like Superposition 8K, FurMark, Metro Exodus Enhanced extreme with ray tracing, OCCT, Heaven, multiple games, etc.

Any idea why Fire Strike crashes as soon as it's loaded unless the clock is down to about 2.5GHz? It's super annoying. I'm certain this card would do over 2.8 if it weren't locked off.

Oh, also, I tried it in my laptop with a tired old i7-7700HQ (a mobile CPU) to see how it scaled. It scales amazingly; this GPU does not discriminate. I'll upload a pic of GPU-Z later when on that rig.

Thanks for reading. Hope it's useful to someone in future who's looking for this information.

Edit: I'm actually running at PCIe 3.0 x4 lanes, not x8. Figured I'd correct that. I also hit 2.8GHz briefly this morning.

Second edit: Thinking about trying MPT, but wondering if that will achieve anything, as I'm already at max? I assume not, and I've no clue what I'm doing with it.


----------



## Butanding1987 (Nov 11, 2021)

Fire Strike needs A LOT of voltage to run at 2800MHz on the 6800 XT, definitely more than ~1.25V.


----------



## sepheronx (Nov 11, 2021)

mechtech said:


> Was hoping to get an RX 6600 this Christmas, but it's not looking good lol. Oh well.


I just picked one up.

It isn't a good card by any measure.  It isn't worth the price and I actually regret the purchase.


----------



## Felix123BU (Nov 11, 2021)

Venom_Chris said:


> Hello all, new here, just posting the information I have to help someone who might google my setup in future.
> 
> Short background: my last rig consisted of an Asus Ares II GPU and an i7-3960X. That's how long I've been away from the tech game (about 6 years).
> 
> ...


Your main problem seems to be that the CPU is overheating and throttling down, which can affect gaming performance and benchmarks. I looked at my Fire Strike Ultra run, and my physics (CPU) score of 28636 is 2.7x better than yours; even though I run a 5800X, it is by no means 2.7 times faster, which probably means your CPU is constantly overheating and performing badly.

As for 2800MHz, most Navi 2 cards have a hard limit at ~2.7GHz; as @Butanding1987 said, a lot more voltage is needed to get past that, and it's usually not worth it performance-wise. The biggest gain in my experience comes from raising the power limit over the maximum allowed with MPT. My best Fire Strike results came at around 380W TGP (with ~275W being the stock max power usage); the performance increase over stock was around 25% with a combo of power and frequency.
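To put those MPT numbers in perspective: the extra performance comes at a disproportionate power cost. A quick back-of-the-envelope sketch using the figures from this post (~275W stock, ~380W tuned, ~25% more performance; the formula and names are mine):

```python
# Rough perf-per-watt comparison for the MPT power-limit raise above:
# ~275 W stock vs ~380 W tuned for ~25% more performance.
def perf_per_watt(relative_perf, watts):
    return relative_perf / watts

stock = perf_per_watt(1.00, 275)   # baseline performance at stock power
tuned = perf_per_watt(1.25, 380)   # MPT-raised power limit
# ~38% more power buys ~25% more performance, so efficiency drops ~9.5%
print(f"{tuned / stock - 1:+.1%}")
```

In other words, you pay for the last few percent of frequency with a lot of watts, which is why the undervolting discussed elsewhere in this thread is usually the better trade.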


----------



## Venom_Chris (Nov 11, 2021)

Felix123BU said:


> Your main problem seems to be that the CPU is overheating and throttling down, which can affect gaming performance and benchmarks. I looked at my Fire Strike Ultra run, and my physics (CPU) score of 28636 is 2.7x better than yours; even though I run a 5800X, it is by no means 2.7 times faster, which probably means your CPU is constantly overheating and performing badly.
> 
> As for 2800MHz, most Navi 2 cards have a hard limit at ~2.7GHz; as @Butanding1987 said, a lot more voltage is needed to get past that, and it's usually not worth it performance-wise. The biggest gain in my experience comes from raising the power limit over the maximum allowed with MPT. My best Fire Strike results came at around 380W TGP (with ~275W being the stock max power usage); the performance increase over stock was around 25% with a combo of power and frequency.


The Fire Strike Ultra screenshot was with the 7700HQ mobile CPU, to show you can get half-decent performance with lesser CPUs. My 9900K runs at 5GHz all-core under 80C now that the RTX 2080 is out of the equation.

This is a normal Fire Strike run with the 9900K (hyperthreading disabled; I keep forgetting to enable it), which I don't think is too bad considering the eGPU at 3.0 x4 lanes.

And yeah, I think I'm gonna be happy with what I've got and not tinker anymore. It's good performance. Really tempted to sell this laptop and build a 12900K desktop, and give this card the CPU and PCIe lanes it deserves! Love it.


----------



## mechtech (Nov 11, 2021)

sepheronx said:


> I just picked one up.
> 
> It isn't a good card by any measure.  It isn't worth the price and I actually regret the purchase.


No cards are good value right now. Also, I would only purchase one if I could get it for $475 CAD or less. From the charts it's quite a jump over my RX 480, but my RX 480 exceeds my needs, so if I keep it for another 5 years, fine by me. More money in my wallet.


----------



## Valantar (Nov 11, 2021)

sepheronx said:


> I just picked one up.
> 
> It isn't a good card by any measure.  It isn't worth the price and I actually regret the purchase.


That sounds overly harsh; the reviews I've seen make it look like a decent competitor. Is there anything beyond the ridiculous state of GPU pricing making you say that?


----------



## Space Lynx (Nov 11, 2021)

I miss my RX 6800. That thing was such a powerhouse for $579. These markets really suck.


----------



## sepheronx (Nov 11, 2021)

Valantar said:


> That sounds overly harsh - the reviews I've seen makes it look like a decent competitor. Is there anything beyond the ridiculous state of GPU pricing making you say that?



Well, the card is a 5700 in performance while costing a small fortune ($660 CAD after taxes at a retailer). I paid the same price for an RTX 3060.

I guess if it was about $400 CAD then it would be a better buy for sure.


----------



## Felix123BU (Nov 11, 2021)

lynx29 said:


> I miss my rx 6800. Thing was such a powerhouse for $579. These markets really suck.


What happened to your 6800?


----------



## Valantar (Nov 11, 2021)

sepheronx said:


> Well, the card is a 5700 in performance while costing a small fortune ($660 CAD after taxes at a retailer). I paid the same price for an RTX 3060.
> 
> I guess if it was about $400 CAD then it would be a better buy for sure.


Doesn't it perform the same as the 3060 (except for RT)?

I completely agree that current prices are garbage, but that's universal. IMO, both the 6600 and 3060 are $300 or below GPUs.


----------



## droopyRO (Nov 11, 2021)

Valantar said:


> both the 6600 and 3060 ~~are~~ would have been $300 or below GPUs.


FTFY 
I don't see a way out of this mess. Not even in 2022; maybe in 2023, and only if Ethereum 2.0 (or whatever it will be called) rolls out and Intel, AMD, and Nvidia pump up their GPU yields. Then maybe we'll see a price drop to MSRP +25-50%, because ATM the prices are MSRP +100-125% where I live.


----------



## Felix123BU (Nov 11, 2021)

droopyRO said:


> FTFY
> I don't see a way out of this mess. Not even in 2022; maybe in 2023, and only if Ethereum 2.0 (or whatever it will be called) rolls out and Intel, AMD, and Nvidia pump up their GPU yields. Then maybe we'll see a price drop to MSRP +25-50%, because ATM the prices are MSRP +100-125% where I live.


I live where you live, and holy hell, yeah, even in normal times there is price gouging here. When I think back at getting my 6800 XT at basically MSRP, I suddenly realize the amount of goodwill the universe bestowed upon me.


----------



## Space Lynx (Nov 12, 2021)

Felix123BU said:


> What happened to your 6800?



I sold my desktop PC a while ago. At the time I didn't really care, because I thought my life was going to take a different trajectory: new friends, going out more, etc. Turned out not to be the case, as most humans on this planet are BS.

I guess gaming is all there is. Game, read, then die of old age. Neat times we live in.


----------



## sepheronx (Nov 12, 2021)

Valantar said:


> Doesn't it perform the same as the 3060 (except for RT)?
> 
> I completely agree that current prices are garbage, but that's universal. IMO, both the 6600 and 3060 are $300 or below GPUs.


It's a tad bit slower than the 3060, and it also doesn't have the same features, as you said. While RT is mostly pointless on the 3060, mixed with DLSS it can be playable.
Plus, if one likes video editing, the 3060 is better there too.

Yes, both are poor value. I'm curious what the 3050s will be.


----------



## Valantar (Nov 12, 2021)

droopyRO said:


> FTFY


Nah, I was commenting on what they actually are, not what Nvidia and AMD want them to be and are forcing people to consider them as. They are $300-or-below GPUs, but they are sold and marketed as something more than that (and scalpers and unscrupulous distributors and retailers make them something like 2x that, of course).


sepheronx said:


> It's a tad bit slower than the 3060, and it also doesn't have the same features, as you said. While RT is mostly pointless on the 3060, mixed with DLSS it can be playable.
> Plus, if one likes video editing, the 3060 is better there too.
> 
> Yes, both are poor value. I'm curious what the 3050s will be.


If the mobile 3050 series is anything to go by: unimpressive, sadly. At least the 6600 has efficiency going for it, which might make the 6500 a 75W card and open the door for slot-powered and even low-profile versions. (Of course they could make the 3050 that low-power as well, but it would be very slow.)


----------



## mechtech (Nov 12, 2021)



sepheronx said:


> I just picked one up.
> 
> It isn't a good card by any measure.  It isn't worth the price and I actually regret the purchase.


Can you not just return it and get your money back? What make/model did you get?


----------



## nolive721 (Nov 13, 2021)

I have undervolted my 6700 XT further: 2200MHz core clock, VRAM frequency untouched, and -1 on the power slider.

The card is drawing between 75 and 100W max, as reported in HWiNFO.

This is amazing, as I am using it on a 720p plasma TV with VSR on in the Radeon drivers to achieve 1440p resolution.
I play sim racing titles with all graphics maxed out, and SOL plus the CSP mod in Assetto Corsa, so I can't be happier.

Remember, I come from a 1080 Ti and a Vega 64, and in similar conditions I was drawing between 150 and 200W, so it's really great.

Yes, I paid $700 for the card, but that was only $100 on top of what I got from my 1080 Ti sale, so I won't complain.


----------



## Space Lynx (Nov 13, 2021)

nolive721 said:


> I have undervolted my 6700 XT further: 2200MHz core clock, VRAM frequency untouched, and -1 on the power slider.
> 
> The card is drawing between 75 and 100W max, as reported in HWiNFO
> 
> ...



Well done, well done!


----------



## Felix123BU (Nov 13, 2021)

nolive721 said:


> I have undervolted my 6700 XT further: 2200MHz core clock, VRAM frequency untouched, and -1 on the power slider.
> 
> The card is drawing between 75 and 100W max, as reported in HWiNFO
> 
> ...


I did something similar with my 6800 XT: after playing around with overclocking the card, I decided I don't need it in anything I play, so I also went to 2200MHz (lower won't work for some reason), and now it barely goes over 100W if I keep it locked at 60Hz, and around 150W if I stick to the 100Hz of my monitor. It's amazing how energy-efficient these cards can be if you downclock them a little bit.

My whole PC now uses somewhere around 240W. That's basically console power usage, for a much more powerful system. It would use even less if not for the water pump and the 6 fans.


----------



## nolive721 (Nov 13, 2021)

Felix123BU said:


> I did something similar with my 6800 XT: after playing around with overclocking the card, I decided I don't need it in anything I play, so I also went to 2200MHz (lower won't work for some reason), and now it barely goes over 100W if I keep it locked at 60Hz, and around 150W if I stick to the 100Hz of my monitor. It's amazing how energy-efficient these cards can be if you downclock them a little bit.
> 
> My whole PC now uses somewhere around 240W. That's basically console power usage, for a much more powerful system. It would use even less if not for the water pump and the 6 fans.


Exactly my thoughts here. The only scenario where the GPU is at default, not even a slight overclock, is when I switch gaming to my triple 1080p 75Hz rig that is driven by the same PC in the room.

But for whatever reason that happens less and less these days.

I even push the envelope and undervolt my Ryzen CPU, most of the time even gaming with the Windows Power Saver mode on, while managing to maintain the locked 60fps I need in my games.

I am sure my whole PC would draw much less than 200W overall doing so, and it's mega, mega quiet even with 5 fans in it (I am using a CPU AIO), despite my case being flagged as a bad-airflow/noisy one (MasterBox Q300L), which is absolutely not true at all, by the way.
The GPU barely reaches 50C (maybe 2-3C more than with my hybrid 1080 Ti and Vega LC) and the CPU is around 45C at its highest.

Just amazing!
So amazing that the console sitting in my living room is still from the PS3 generation, and it's gathering dust lol.


----------



## Felix123BU (Nov 13, 2021)

nolive721 said:


> Exactly my thoughts here. The only scenario where the GPU is at default, not even a slight overclock, is when I switch gaming to my triple 1080p 75Hz rig that is driven by the same PC in the room.
> 
> But for whatever reason that happens less and less these days.
> 
> ...


Totally agree. There is a mental thing going on in most of us hardware enthusiasts to push things to their absolute max, but with some small exceptions, that is just a waste of electrical energy.
I pushed my whole system to the absolute limit, and it was lots of fun, especially the 6800 XT; I was and still am amazed at how much this card can push. But realistically, for 95% of the games I play, I barely need half of its power, hence the low-power tuning. It's good to know that if games come along in the future that need it fully, there is a reserve there, and with how the GPU market is going, my card will live a long life in my system.

And yeah, one major change with this low-power setup is the absolute quiet it brings. I tuned all my fans down, pump down, simply because at ~200W (CPU+GPU) the heat produced is so low that even with fans at 25% and lower everything stays super cool. The CPU barely reaches 55C, and for a 5800X, which is notoriously hot, that's very cool; the 6800 XT almost never touches 40C.


----------



## nolive721 (Nov 13, 2021)

Just out of curiosity, what core voltage did you set your 2.2GHz clock at?


----------



## Felix123BU (Nov 14, 2021)

nolive721 said:


> Just out of curiosity what core voltage did you set your 2.2ghz clock at?


1020mV; any lower with the 2200MHz core clock and it would crash the driver. I know it could go lower; I suspect it's a driver limitation or bug, since it can run 2700MHz at 1040mV 24/7.


----------



## nolive721 (Nov 14, 2021)

Thanks, that's helpful.

This is where mine sits, and I also believe it could go a bit lower. I'll do some more testing soon, when time permits.


----------



## Felix123BU (Nov 14, 2021)

This is mine. One difference vs yours is that I set an 1800MHz minimum frequency, because I noticed that in some less graphically intensive games, mainly older ones, the GPU tends to fluctuate a lot between 500 and 2200MHz when the load is very low, which can cause very erratic framerates; and the power draw does not differ much anyway with the min limit set at 1800 vs 500MHz.


----------



## nolive721 (Nov 14, 2021)

Fair point here.
I play modern titles with GPU load always above 80%, so I didn't feel the need to set a min core clock.

But I will definitely revisit the core voltage.

Curious how you ended up with -6% on the power slider?
Any real-life impact in your experience?

Thanks a lot.


----------



## sepheronx (Nov 14, 2021)

mechtech said:


> Can you not just return it and get your money back? What make/model did you get?


Its an Asus Dual.

No, all sales are final when it comes to GPUs at Memory Express.


----------



## Felix123BU (Nov 14, 2021)

nolive721 said:


> Fair point here.
> I play modern titles with GPU load always above 80%, so I didn't feel the need to set a min core clock.
> 
> But I will definitely revisit the core voltage.
> ...


-6% on the power slider because the normal PPT (package power) limit is 255W, so with -6% it's 239W max. That's still double what the GPU mostly uses with these settings, and it is the lowest I can go.
As for the real-life impact of the -6%: none; as I said, it's just the upper wattage limit the GPU can use, but at 2200MHz max and 1020mV it uses between 80W and 180W, still under the PPT limit.
I just got into Skyrim, and with everything turned up, mods and such, it uses 80-100W. In Cyberpunk 2077 it goes to 150-180W, all at 3440x1440 resolution and locked to 60Hz.
I would set it lower, but -6% is the lowest the driver allows.
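The slider-to-watts arithmetic generalizes to a one-liner; here's a quick Python sketch using the 255W PPT figure from this post (the function name is mine, not an actual driver API):

```python
# Sketch of the Adrenalin power-slider arithmetic: the slider is a
# percentage applied to the board's package power target (PPT).
def effective_power_limit(ppt_watts, slider_pct):
    """Power ceiling after applying the driver's percentage slider."""
    return ppt_watts * (1 + slider_pct / 100)

print(f"{effective_power_limit(255, -6):.1f} W")   # -6%  -> ~239.7 W
print(f"{effective_power_limit(255, 15):.1f} W")   # +15% -> ~293.3 W
```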


----------



## nolive721 (Nov 14, 2021)

OK, thanks sir, makes sense.


----------



## mechtech (Nov 14, 2021)

sepheronx said:


> Its an Asus Dual.
> 
> No, all sales are final when it comes to GPU's at memoryexpress.


Ya, I remember seeing that. That's kinda BS, as it's supposed to be a 14-day return policy on electronics.


----------



## GamerGuy (Nov 15, 2021)

sepheronx said:


> Its an Asus Dual.
> 
> No, all sales are final when it comes to GPU's at memoryexpress.


Is that a card for your second system? I see a 6800XT in your sig.....


----------



## sepheronx (Nov 15, 2021)

GamerGuy said:


> Is that a card for your second system? I see a 6800XT in your sig.....


Not quite. I use my 6800 XT for other stuff in my NAS/workstation/server, etc.

Now that is a good card. I love it. I also let my kids use that rig to play games. I don't play a lot of games anymore; I mostly play retro games (emulation), because most modern games are just copies of each other and boring as all heck. Hopefully that changes. The 6600 was actually a purchase for my main rig, as a cheap light-gaming card and to run Linux. But the price of the card was the part that bothered me the most.


----------



## Tradition (Nov 15, 2021)

Got myself a 6900 XT LC edition and found out that the original BIOS with the 2310MHz memory clock has horrible timings, so my 3DMark scores were awful.
I flashed it with the Asus LC edition BIOS and got these results.
I limited the card to 355A and 402W.
It stays around 2750MHz during the test.
If I increase the power draw I can go beyond that, but I don't really want to break the card.

I scored 19 186 in Time Spy

AMD Ryzen 5 5600X, AMD Radeon RX 6900 XT x 1, 16384 MB, 64-bit Windows 11

www.3dmark.com

----------



## lunatik (Nov 15, 2021)

Are there any simple (non-undervolt) ways to lower the junction temp on a 6700 XT (Sapphire Nitro)? The problem is 55C normal edge temp but hitting 100C on the junction, and crashes. Seems like an issue with thermal pads or something? I've never really opened up a GPU before. Or is it just normal with an air-cooled GPU?


----------



## GamerGuy (Nov 16, 2021)

lunatik said:


> Are there any simple (non-undervolt) ways to lower the junction temp on a 6700 XT (Sapphire Nitro)? The problem is 55C normal edge temp but hitting 100C on the junction, and crashes. Seems like an issue with thermal pads or something? I've never really opened up a GPU before. Or is it just normal with an air-cooled GPU?


How's the case airflow in your rig? Also, have you tried adjusting the fan curve via the Adrenalin control panel? I have the Nitro+ RX 6900 XT and don't have an issue with heat with my fans ramped up (that is, if you don't mind the fan noise, which isn't that bad for me as my rig's on the floor).


----------



## Butanding1987 (Nov 16, 2021)

Tradition said:


> Got myself a 6900 XT LC edition and found out that the original BIOS with the 2310MHz memory clock has horrible timings, so my 3DMark scores were awful.
> I flashed it with the Asus LC edition BIOS and got these results.
> I limited the card to 355A and 402W.
> It stays around 2750MHz during the test.
> ...


It's pointless to buy an expensive high-end card like that if you're not brave enough to push it. My 6800 XT is not far behind your 6900 XT.


----------



## Valantar (Nov 16, 2021)

lunatik said:


> Are there any simple (non-undervolt) ways to lower the junction temp on a 6700 XT (Sapphire Nitro)? The problem is 55C normal edge temp but hitting 100C on the junction, and crashes. Seems like an issue with thermal pads or something? I've never really opened up a GPU before. Or is it just normal with an air-cooled GPU?


That's definitely not normal. The TPU review showed <20°C between the edge and hotspot temperatures. Either it's poorly mounted, has a poor thermal paste application, or has some sort of production flaw (a non-flat cooler surface, for example). My first suggestion would be to RMA it, though given the current situation that might leave you without a GPU for a while. Failing that, open it and check the thermal paste coverage and contact (i.e. how thinly the paste is squeezed across the die). Mounting pressure is often an issue with AMD cards for some reason, and getting some thin plastic washers and putting them between the screws on the back of the PCB (the four around the die specifically) and the PCB can alleviate this - but then you also need to be careful, as too high a pressure can damage the card. The safest route is an RMA.


----------



## lunatik (Nov 16, 2021)

GamerGuy said:


> How's the case airflow in your rig? Also, have you tried adjusting the fan curve via the Adrenalin control panel? I have the Nitro+ RX 6900 XT and don't have an issue with heat with my fans ramped up (that is, if you don't mind the fan noise, which isn't that bad for me as my rig's on the floor).


Phanteks P500A: 3x 140mm front intake, 1x 140mm rear exhaust, 1x 140mm top exhaust, and 1x 140mm fan over the RAM. The GPU runs at 2780-2880MHz with 1110mV set in Adrenalin (this is with a junction temp of 95-98C under full load). I'm just looking for ways to push it further, since lowering the voltage makes it crash, and a higher voltage would mean the junction temp getting too high.


----------



## ratirt (Nov 16, 2021)

lunatik said:


> Phanteks P500A: 3x 140mm front intake, 1x 140mm rear exhaust, 1x 140mm top exhaust, and 1x 140mm fan over the RAM. The GPU runs at 2780-2880MHz with 1110mV set in Adrenalin (this is with a junction temp of 95-98C under full load). I'm just looking for ways to push it further, since lowering the voltage makes it crash, and a higher voltage would mean the junction temp getting too high.


The 140mm fan over the RAM may disrupt the proper airflow in your case. My 6900 XT never hits 100C, but I can double-check what temp it shows after playing for a while.


----------



## Valantar (Nov 16, 2021)

ratirt said:


> The 140mm fan over the RAM may disrupt the proper airflow in your case. My 6900 XT never hits 100C, but I can double-check what temp it shows after playing for a while.


Airflow issues will not cause a 45-degree delta between edge and hotspot temperatures. Airflow issues will affect all temperatures roughly equally. There is something wrong with that cooler or its mounting, period.


----------



## lunatik (Nov 16, 2021)

ratirt said:


> The 140mm fan over the RAM may disrupt the proper airflow in your case. My 6900 XT never hits 100C, but I can double-check what temp it shows after playing for a while.


Yeah, might be. I thought about it too while typing. Although when I pushed this card to its current "max" limits, I didn't have that fan over the RAM. There's also my 5600X, running on a be quiet! tower cooler (can't remember the name, 1 fan though). I'll test whether it has any impact on temps, with and without, later tonight.


----------



## Yraggul666 (Nov 16, 2021)

HELL->o! I finally joined the club.


----------



## lunatik (Nov 16, 2021)

Valantar said:


> Airflow issues will not cause a 45-degree delta between edge and hotspot temperatures. Airflow issues will affect all temperatures roughly equally. There is something wrong with that cooler or its mounting, period.


I'll also try tightening the screws. I don't really care that much about the warranty; I'm just bored out of my mind with another lockdown. And if needed, I'll order thermal paste and pads too.


----------



## Valantar (Nov 16, 2021)

lunatik said:


> I'll also try tightening the screws. I don't really care that much about the warranty; I'm just bored out of my mind with another lockdown. And if needed, I'll order thermal paste and pads too.


Most likely the screws can't be tightened sufficiently on their own, and overtightening risks stripping or shearing them, in which case you're up the proverbial creek without a paddle. If you don't have any nylon washers handy, try cutting some yourself from stiff plastic. Otherwise, any decent hardware store should have nylon washers; 1mm thick should be plenty.


----------



## ratirt (Nov 16, 2021)

Valantar said:


> Airflow issues will not cause a 45-degree delta between edge and hotspot temperatures. Airflow issues will affect all temperatures roughly equally. There is something wrong with that cooler or its mounting, period.


Junction temps can run high on the 6000 series. 100°C is quite extreme, but you can't disregard all the other possible causes. He has this card overclocked, so it is normal for the temperature to climb quickly from idle to full load. If the hot air is not removed at a certain pace due to a disturbance in the airflow, it can cause overheating and rising temperatures over time.
The difference between the core temp and the junction can reach these levels, especially if the airflow is not OK. He has an air-cooled card, and this can happen. It is different from your case: a liquid-cooled card, which probably has a full-cover block on the PCB.
I'm not saying the mounting is all good, but it's worth trying the easy things anyway and seeing if they improve matters. I've got a Vega 64, and the difference between junction and core was sometimes 40 degrees.

He can try removing the case side panel and seeing whether the temps stay as high. If it no longer reaches the 100°C it hits with the case closed, it's an airflow issue. Remounting might help as well, but let's focus on the easy stuff before we start stripping the card apart over what may not be the problem.


----------



## Valantar (Nov 16, 2021)

ratirt said:


> Junction temps can run high on the 6000 series. 100°C is quite extreme, but you can't disregard all the other possible causes. He has this card overclocked, so it is normal for the temperature to climb quickly from idle to full load. If the hot air is not removed at a certain pace due to a disturbance in the airflow, it can cause overheating and rising temperatures over time.
> The difference between the core temp and the junction can reach these levels, especially if the airflow is not OK. He has an air-cooled card, and this can happen. It is different from your case: a liquid-cooled card, which probably has a full-cover block on the PCB.
> I'm not saying the mounting is all good, but it's worth trying the easy things anyway and seeing if they improve matters. I've got a Vega 64, and the difference between junction and core was sometimes 40 degrees.
> 
> He can try removing the case side panel and seeing whether the temps stay as high. If it no longer reaches the 100°C it hits with the case closed, it's an airflow issue. Remounting might help as well, but let's focus on the easy stuff before we start stripping the card apart over what may not be the problem.


No. TPU's review of that specific GPU, the 6700 XT Nitro+, which I linked above, showed a <20-degree delta. Different workloads can affect the deltas differently, but seeing significantly more than that is a clear indication that there is something wrong with the cooler. A near 50-degree delta is _not_ normal under any circumstance for any GPU, period. GPUs generally spread their heat generation across the majority of the die, which leads to relatively even thermals. A 20-degree delta is fine, 30 is kind of poor but acceptable, 50 is borderline dangerous.
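For what it's worth, the rule of thumb above can be written down as a quick sanity check. This is just a sketch of the thresholds stated in this post; the function and its wording are my own, not anything official:

```python
def classify_delta(edge_c: float, hotspot_c: float) -> str:
    """Map the hotspot-minus-edge delta (in degrees C) to a rough verdict."""
    delta = hotspot_c - edge_c
    if delta <= 20:
        return "fine"
    if delta <= 30:
        return "poor but acceptable"
    if delta < 50:
        return "indicates a mounting problem"
    return "borderline dangerous"

# The worst case reported in this thread: ~55 °C edge vs ~100 °C hotspot.
print(classify_delta(55, 100))
```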

For reference: my deltas with the LDU are not particularly good - depending on the load they often exceed 20 degrees, and I've seen it creep towards 30. This is typical of PowerColor, who tend to have questionable QC on their cooler mounting. As the absolute temperatures are low it ultimately doesn't matter, but with a delta as high as what is described here, you essentially have zero thermal headroom for the die. That is a bad situation to be in.

Besides, if airflow were the issue, there's no way the edge temperature would be 55°C. The die is not large enough that airflow over different parts of the cooler will affect different areas of the die significantly differently. What's more, the heatpipes run across the length of the GPU, meaning that for your scenario of airflow affecting things to be true, airflow would need to be selectively _massively_ worse for a single heatpipe. That is, well, not possible when you have three fans blowing directly across them. There's also a copper plate between the heatpipes and the die. If airflow were poor, the cooler would be unable to dissipate its heat into the air, leading to the cooler warming up, in turn leading to the entire die warming up. So there is no realistic scenario in which poor airflow will selectively make one part of the die drastically hotter than another. Poor contact or flatness is the only reasonable explanation, period.

Also, Vega was _notorious_ for poor cooler fitment due to the height differences between HBM and the GPU die. Using that as a reference point for anything is a bad idea.


----------



## ratirt (Nov 16, 2021)

Valantar said:


> No. TPU's review of that specific GPU, the 6700 XT Nitro+, which I linked above, showed a <20-degree delta. Different workloads can affect the deltas differently, but seeing significantly more than that is a clear indication that there is something wrong with the cooler. A near 50-degree delta is _not_ normal under any circumstance for any GPU, period. GPUs generally spread their heat generation across the majority of the die, which leads to relatively even thermals. A 20-degree delta is fine, 30 is kind of poor but acceptable, 50 is borderline dangerous.
> 
> For reference: my deltas with the LDU are not particularly good - depending on the load they often exceed 20 degrees, and I've seen it creep towards 30. This is typical of PowerColor, who tend to have questionable QC on their cooler mounting. As the absolute temperatures are low it ultimately doesn't matter, but with a delta as high as what is described here, you essentially have zero thermal headroom for the die. That is a bad situation to be in.
> 
> Besides, if airflow were the issue, there's no way the edge temperature would be 55°C. The die is not large enough that airflow over different parts of the cooler will affect different areas of the die significantly differently. What's more, the heatpipes run across the length of the GPU, meaning that for your scenario of airflow affecting things to be true, airflow would need to be selectively _massively_ worse for a single heatpipe. That is, well, not possible when you have three fans blowing directly across them. There's also a copper plate between the heatpipes and the die. If airflow were poor, the cooler would be unable to dissipate its heat into the air, leading to the cooler warming up, in turn leading to the entire die warming up. So there is no realistic scenario in which poor airflow will selectively make one part of the die drastically hotter than another. Poor contact or flatness is the only reasonable explanation, period.


Each computer configuration is different, and you don't know what his is. It's worth checking everything, but the simple things come first.
He can open his case and check.


Valantar said:


> Also, Vega was _notorious_ for poor cooler fitment due to the height differences between HBM and the GPU die. Using that as a reference point for anything is a bad idea.


Vega was just an example. I have an air-cooled 6900 XT, and I know what happened when I had bad airflow and what happens when I open my case.


----------



## Valantar (Nov 16, 2021)

ratirt said:


> Each computer configuration is different, and you don't know what his is. It's worth checking everything, but the simple things come first.
> He can open his case and check.
> 
> Vega was just an example. I have an air-cooled 6900 XT, and I know what happened when I had bad airflow and what happens when I open my case.


Yes, the temperature of the entire GPU changes. That's what airflow does. If changing airflow dramatically changes the delta between edge and hotspot temperatures for you, there is something very wrong with your cooler as well.

But, just to check, can you open HWInfo64, double click the GPU temperature and GPU Hotspot temperature readings to open the graph view for both, run a load on that 6900XT for a few minutes until it stabilizes, and then open your case and let it run a few minutes more, then screenshot the graphs so we can see the change? You're making me curious here.
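If graph screenshots are a hassle, HWiNFO can also log its sensors to CSV, and the before/after comparison can be scripted instead of eyeballed. A rough sketch; the column names below are assumptions and need to be matched to whatever your HWiNFO log actually calls the two sensors:

```python
import csv
import io
import statistics

# Assumed column headers - check your own HWiNFO CSV log for the real names.
EDGE_COL = "GPU Temperature [°C]"
HOT_COL = "GPU Hot Spot Temperature [°C]"

def avg_delta(csv_text: str) -> tuple[float, float, float]:
    """Return (mean edge, mean hotspot, mean delta) from a sensor log."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    edge = statistics.mean(float(r[EDGE_COL]) for r in rows)
    hot = statistics.mean(float(r[HOT_COL]) for r in rows)
    return edge, hot, hot - edge

# Tiny made-up log standing in for a few minutes of a real capture.
sample = (
    f"{EDGE_COL},{HOT_COL}\n"
    "62,96\n"
    "63,97\n"
    "63,96\n"
)
edge, hot, delta = avg_delta(sample)
print(f"edge {edge:.1f} °C, hotspot {hot:.1f} °C, delta {delta:.1f} °C")
```

Run it once on a log captured with the case closed and once with it open; if the two deltas are roughly the same, airflow isn't the variable.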


----------



## ratirt (Nov 16, 2021)

Valantar said:


> Yes, the temperature of the entire GPU changes. That's what airflow does. If changing airflow dramatically changes the delta between edge and hotspot temperatures for you, there is something very wrong with your cooler as well.


He's got a fan blowing on the RAM. This may screw up the airflow. Let him try things.


Valantar said:


> But, just to check, can you open HWInfo64, double click the GPU temperature and GPU Hotspot temperature readings to open the graph view for both, run a load on that 6900XT for a few minutes until it stabilizes, and then open your case and let it run a few minutes more, then screenshot the graphs so we can see the change? You're making me curious here.


I got good airflow after upgrading, so there's no need for that. My GPU doesn't hit 100°C junction. But it did get hot some time back when the airflow was bad.


----------



## Valantar (Nov 16, 2021)

ratirt said:


> He's got a fan blowing on the RAM. This may screw up the airflow. Let him try things.


A fan blowing on the RAM will either change the airflow patterns for a large part of the GPU, or not at all. Either way, it will not cause high hotspot temperatures while maintaining low edge temperatures. This is simply not possible due to how air coolers work.


ratirt said:


> I got good airflow after upgrading, so there's no need for that. My GPU doesn't hit 100°C junction. But it did get hot some time back when the airflow was bad.


I'm not asking you to test it for your own sake, I'm asking you to test it to corroborate your own statements. If you can't do that, then I have no choice but to err on the side of logic and assume that you've either misremembered or otherwise misidentified the dynamics of your cooler, because there is, to my knowledge, no way a change in airflow can affect hotspot temperatures significantly more than edge temperatures. So, unless you can actually show some data indicating otherwise, I'll keep assuming that those two temperature readings change roughly in sync as the environment around them changes. Now, I know the delta is generally smaller at lower loads - my 6900 XT has much higher deltas at full power than in my ~190W UV profile (~10-15 degrees with UV, mid-20s without) - but that's with a relatively poorly mounted cooler from a manufacturer known for this. And, of course, the <20-degree deltas in the TPU review are under full load. Even if some other load triggers warmer hotspots than what they saw, anything above 30 would be deeply worrying in light of that. And that's the point here: _some_ change in the delta across a range of loads is normal. A 45-degree delta is not normal for any card with a well-functioning cooler, period, regardless of environmental factors.


----------



## Felix123BU (Nov 16, 2021)

If you have a 40-50°C difference between the GPU temp and the hotspot, that is 99% a case of a bad mount and uneven pressure.
On another point, 100°C on the hotspot for an OC'd card on air is really not that bad. I do wonder if the difference between the two temps is correct; in cases of bad mounts with parts of the GPU not being cooled properly, you would expect a lot more than 100°C on the hotspot.


----------



## ratirt (Nov 16, 2021)

Valantar said:


> I'm not asking you to test it for your own sake, I'm asking you to test it to corroborate your own statements. If you can't do that, then I have no choice but to err on the side of logic and assume that you've either misremembered or otherwise misidentified the dynamics of your cooler, because there is, to my knowledge, no way a change in airflow can affect hotspot temperatures significantly more than edge temperatures. So, unless you can actually show some data indicating otherwise, I'll keep assuming that those two temperature readings change roughly in sync as the environment around them changes. Now, I know the delta is generally smaller at lower loads - my 6900 XT has much higher deltas at full power than in my ~190W UV profile (~10-15 degrees with UV, mid-20s without) - but that's with a relatively poorly mounted cooler from a manufacturer known for this. And, of course, the <20-degree deltas in the TPU review are under full load. Even if some other load triggers warmer hotspots than what they saw, anything above 30 would be deeply worrying in light of that. And that's the point here: _some_ change in the delta across a range of loads is normal. A 45-degree delta is not normal for any card with a well-functioning cooler, period, regardless of environmental factors.


I had that problem with airflow. I'm not testing anything; I'm speaking from experience. I had an airflow problem that made the hotspot temps go wild, and that was my advice for that member: check the airflow and see if you can adjust something, since maybe something is wrong there. Maybe the heatsink needs adjusting, but it doesn't have to be that either. I don't know exactly what setup he has, so insisting it is this or that will not help him, because so many different things can affect the temps.
Dude, I don't want to argue with you about this nonsense. Is there really a need for me to prove that proper airflow in the case matters? Airflow in the case matters, and you will not convince me otherwise. I'm sure there are people who will tell you the same thing.
What I can tell you is that when I upgraded my fan setup, the heat coming out of the case was tremendous, especially after running the card at full speed for a while.


----------



## lunatik (Nov 16, 2021)

So, I did a fresh Windows install and also re-installed all the drivers. Put everything at stock/default.
GPU temp 63, hotspot 97 (front and top silent covers removed, also the RAM fan)
GPU temp 62, hotspot 96 (+ the side panels removed)

Edit: or maybe my testing method is completely wrong for this? This was done in the Time Spy stress test, 20 loops.


----------



## Valantar (Nov 16, 2021)

ratirt said:


> Dude, I don't want to argue with you about this nonsense. Is there really a need for me to prove that proper airflow in the case matters? Airflow in the case matters, and you will not convince me otherwise. I'm sure there are people who will tell you the same thing.


I never said anything even remotely to this effect. I said poor airflow doesn't cause hot spot temps specifically to rise dramatically while edge temps are normal, as this is simply not how physics and air coolers work. The absolute worst case scenario for this is if the fins around one of the heatpipes got no airflow while the rest did, which would cause the die to selectively heat up, but, again, this isn't possible with how heatsinks are laid out. If all heatpipes get roughly equal airflow - which they will due to the fans mounted to the heatsink - then all heatpipes will dissipate roughly equal amounts of heat, and all areas of the die will get roughly the same cooling, barring unevenness in the mount, flatness issues in the cold plate, production errors causing poor heatpipe-to-cold plate contact, or other such issues. If one or more of the GPU fans gets too little cool air, is recycling hot air, or is obstructed to such a degree that it underperforms, then this affects all heatpipes roughly equally. The scenario you are describing does not make sense. That is why I'm asking you to provide some data. Not because I'm somehow denying that case airflow matters.

It's no wonder that a 330W+ GPU needs a lot of ventilation. What would be a wonder would be if adding more cooling affected hotspot temperatures significantly more than overall GPU temperatures.
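The argument can be put in toy-model form: treat the edge reading as ambient plus power times the heatsink's air-side thermal resistance (what airflow changes), and the hotspot as the edge reading plus power times the die-to-cold-plate contact resistance (what a bad mount changes). All the resistance and power numbers below are made up purely for illustration:

```python
def temps(power_w: float, r_air: float, r_contact: float,
          ambient: float = 25.0) -> tuple[float, float]:
    """Toy lumped model: edge depends on airflow, the delta on contact."""
    edge = ambient + power_w * r_air          # air-side resistance, K/W
    hotspot = edge + power_w * r_contact      # die-to-plate resistance, K/W
    return edge, hotspot

P = 300.0  # watts, roughly a big Navi card under load (illustrative)
for r_air in (0.10, 0.13):            # better vs worse airflow
    for r_contact in (0.05, 0.15):    # good vs bad cooler mount
        e, h = temps(P, r_air, r_contact)
        print(f"r_air={r_air:.2f} r_contact={r_contact:.2f}: "
              f"edge {e:.0f} °C, hotspot {h:.0f} °C, delta {h - e:.0f} °C")
```

In this model, worsening the airflow raises both readings by the same amount while leaving the delta untouched; only the contact term moves the gap between them, which is the point being argued.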


lunatik said:


> So, I did a fresh Windows install and also re-installed all the drivers. Put everything at stock/default.
> GPU temp 63, hotspot 97 (front and top silent covers removed, also the RAM fan)
> GPU temp 62, hotspot 96 (+ the side panels removed)
> 
> Edit: or maybe my testing method is completely wrong for this? This was done in the Time Spy stress test, 20 loops.


Unless you're controlling fan speeds you can't really compare the two, as the side panel removed test might have the GPU running at lower rpms. I'd still say >30 degrees indicates a poor mount or something similar, as that's an unusually high thermal delta. You also kind of need to monitor clocks, as GPUs these days adjust those very actively.


----------



## lunatik (Nov 16, 2021)

Edit: discard all that, Adrenalin just bugged out and didn't actually apply any settings, global or Warzone.

Edit: I remembered that I had some thermal paste left over from my previous computer and changed it on both the CPU and GPU. With the same default/stock settings and Time Spy loop, my GPU temp increased from 63 to 65, but the hotspot decreased from 97 to 87.

Thank you all for the quick replies and help


----------



## ratirt (Nov 17, 2021)

lunatik said:


> Edit: discard all that, Adrenalin just bugged out and didn't actually apply any settings, global or Warzone.
> 
> Edit: I remembered that I had some thermal paste left over from my previous computer and changed it on both the CPU and GPU. With the same default/stock settings and Time Spy loop, my GPU temp increased from 63 to 65, but the hotspot decreased from 97 to 87.
> 
> Thank you all for the quick replies and help


Gratz dude, you got it sorted out. Happy gaming.


----------



## Valantar (Nov 17, 2021)

lunatik said:


> Edit: I remembered that I had some thermal paste left over from my previous computer and changed it on both the CPU and GPU. With the same default/stock settings and Time Spy loop, my GPU temp increased from 63 to 65, but the hotspot decreased from 97 to 87.
> 
> Thank you all for the quick replies and help


That's great! Good to hear it was a relatively easy fix.


----------



## nolive721 (Nov 21, 2021)

Felix123BU said:


> This is mine; one difference vs yours is that I set an 1800MHz minimum frequency, because I noticed that in some less graphically intensive games, mainly older ones, a 500MHz floor can cause very erratic framerates: the GPU tends to fluctuate a lot between 500 and 2200MHz in old games where the load is very low, and the power draw does not differ much anyway with the minimum set at 1800 vs 500MHz.
> 
> View attachment 225054


Thanks again, because your post pushed me to revisit my UV, and oh boy, it's been rewarding over the last few days of testing.

I have managed to decrease my Vcore to 1010mV and still be stable in heavy gaming at 2200MHz core.
Power draw is down to 110W in worst-case conditions (sim racing titles like PC2 or AC with heavy rain and, again, graphics at Ultra). It's just an amazing card.

Now I am being offered the equivalent of $1000 for it (around $300 more than what I paid for it two months ago), so I need to resist the temptation.


----------



## Fatal Fighter (Dec 4, 2021)

Hey guys
Way late to the party, but here is my RX 6900 XT. Hopefully I may join this awesome club!


----------



## Felix123BU (Dec 4, 2021)

Fatal Fighter said:


> Hey guys
> Way late to the party, but here is my RX 6900 XT. Hopefully I may join this awesome club!


Ownership of an RX 6900 XT grants automatic admission   
Kidding of course, anyone can join 
Enjoy the card!


----------



## Fatal Fighter (Dec 4, 2021)

Felix123BU said:


> Ownership of an RX 6900 XT grants automatic admission
> Kidding of course, anyone can join
> Enjoy the card!


Thank you man!

I've owned this card for about two weeks. At first it almost gave me a heart attack.
I bought it brand new, but from some guy on the local forum. I ran Time Spy, during which the PC shut down, and the same happened with all the other benchmarks.
It turns out an 850W Gold PSU is not enough due to power spikes; I swapped it for a 1000W Platinum by be quiet! and the issue was fixed.
Now I'm waiting for some cold nights to arrange some overclocking sessions. So far, a 24k GPU score in Time Spy.

I've been going through this topic to find some tips on MPT usage for this model, and while doing so, decided to post here as well.


----------



## Valantar (Dec 4, 2021)

Fatal Fighter said:


> Thank you man!
> 
> I've owned this card for about two weeks. At first it almost gave me a heart attack.
> I bought it brand new, but from some guy on the local forum. I ran Time Spy, during which the PC shut down, and the same happened with all the other benchmarks.
> ...


Wow, what PSU was that? My 6900 XT Liquid Devil Ultimate has been 100% stable on my Corsair SF750 - and that's an SFX PSU, so it has smaller bulk caps than most ATX PSUs and should thus be at least somewhat worse at handling transient load spikes. Though I've heard of a lot of odd behaviour with newer GPUs from both brands.


----------



## Fatal Fighter (Dec 5, 2021)

Valantar said:


> Wow, what PSU was that? My 6900 XT Liquid Devil Ultimate has been 100% stable on my Corsair SF750 - and that's an SFX PSU, so it has smaller bulk caps than most ATX PSUs and should thus be at least somewhat worse at handling transient load spikes. Though I've heard of a lot of odd behaviour with newer GPUs from both brands.


I would never have thought that the EVGA 850 GA Gold was not enough, but that was it, as I saw TPU's review, where power spikes of up to 600W+ were mentioned for the 6900 XT.
Your 6900 XT is paired with a Ryzen 5 or 7, right?


----------



## Valantar (Dec 5, 2021)

Fatal Fighter said:


> I would never have thought that the EVGA 850 GA Gold was not enough, but that was it, as I saw TPU's review, where power spikes of up to 600W+ were mentioned for the 6900 XT.
> Your 6900 XT is paired with a Ryzen 5 or 7, right?


A 5800X, yes. What CPU are you using?


----------



## Fatal Fighter (Dec 5, 2021)

Valantar said:


> A 5800X, yes. What CPU are you using?


5950X
There is some chart; I don't know how accurate it is overall, but it was definitely accurate in my case.
Gigabyte's website recommends a 900W PSU as well, but I still didn't believe it for a very long time.
Someone posted this on Reddit:


----------



## Valantar (Dec 5, 2021)

Fatal Fighter said:


> 5950X
> There is some chart; I don't know how accurate it is overall, but it was definitely accurate in my case.
> Gigabyte's website recommends a 900W PSU as well, but I still didn't believe it for a very long time.
> Someone posted this on Reddit:


Yeah, I've seen that before. I don't think the CPU has much of an effect though, at least on the AMD side - the 5800X, 5900X and 5950X have the exact same power limits after all, and the higher-end chips are typically better binned, meaning they consume less power per core and thread under the same load (except in very lightly threaded workloads, where the 5900X and 5950X consume 2-3W more per core due to their higher clocks). And CPUs don't cause that many load spikes. One difference might be that the faster CPU is letting the GPU run more freely, causing it to spike more dramatically, though I wouldn't expect that to be true in all workloads. I'm quite surprised by this, tbh.


----------



## Felix123BU (Dec 5, 2021)

Fatal Fighter said:


> 5950X
> There is some chart; I don't know how accurate it is overall, but it was definitely accurate in my case.
> Gigabyte's website recommends a 900W PSU as well, but I still didn't believe it for a very long time.
> Someone posted this on Reddit:


That can be true in an absolute sense, as in: however crappy a PSU is, if it can push that amount, it's "safe". And even then it's not always correct.
It's not true in the sense that people always mean by "buy a quality PSU": with a good PSU, you definitely won't need 1000W to drive a 6900 XT (350W max) plus a 5950X (150W max).
But some people either don't know or forget that a PSU degrades over time. You might have had a 750W unit that worked fine three years ago when you bought it, but today it will not perform the same anymore. And there are so many cases where a PSU advertises, say, 750W but cannot push more than 350W sustained without problems; you would be amazed how many of those are on the market, including some known brands, though the difference is not as extreme as with crappy cheap PSUs.
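As a back-of-the-envelope illustration of the sizing logic above: the 2.0x transient multiplier is an assumption loosely based on reviews reporting ~600W spikes from a ~300W-class card, and the ageing derate is likewise a made-up illustrative figure, not a measured one. A quality PSU with good transient handling will get away with less than this suggests:

```python
def min_psu_watts(gpu_w: float, cpu_w: float, rest_w: float = 75.0,
                  gpu_transient: float = 2.0,
                  ageing_derate: float = 0.85) -> float:
    """Smallest label wattage that still covers GPU transient spikes.

    gpu_transient: assumed ratio of spike power to sustained GPU power.
    ageing_derate: assumed fraction of label wattage an older PSU delivers.
    """
    peak = gpu_w * gpu_transient + cpu_w + rest_w
    return peak / ageing_derate

# 6900 XT (~350W sustained) + 5950X (~150W): it's the spikes, not the
# averages, that trip a PSU's over-power protection.
print(round(min_psu_watts(350, 150)))
```

With these pessimistic assumptions the sketch lands near the 1000W unit that fixed the problem in this thread; drop the transient multiplier toward 1.0 and it falls back to the "a good 750W is plenty" camp.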

Glad you found a solution to your problem; in this day and age, buying a top-tier GPU and then seeing it not work properly must be a very nasty experience.


----------



## Fatal Fighter (Dec 5, 2021)

Felix123BU said:


> That can be true in an absolute sense, as in: however crappy a PSU is, if it can push that amount, it's "safe". And even then it's not always correct.
> It's not true in the sense that people always mean by "buy a quality PSU": with a good PSU, you definitely won't need 1000W to drive a 6900 XT (350W max) plus a 5950X (150W max).
> But some people either don't know or forget that a PSU degrades over time. You might have had a 750W unit that worked fine three years ago when you bought it, but today it will not perform the same anymore. And there are so many cases where a PSU advertises, say, 750W but cannot push more than 350W sustained without problems; you would be amazed how many of those are on the market, including some known brands, though the difference is not as extreme as with crappy cheap PSUs.
> 
> Glad you found a solution to your problem; in this day and age, buying a top-tier GPU and then seeing it not work properly must be a very nasty experience.


I still can't believe that the EVGA 850 GA Gold was not enough, or had, I don't know, some kind of incompatibility? lol. It was a brand-new PSU, and the RX 6800 was running without any issues at all.
I guess it could not handle the spikes, but on the other hand, I have seen builds with 750W PSUs with no issues whatsoever.

Thank you for the kind words!


----------



## GamerGuy (Dec 5, 2021)

@Fatal Fighter
'Grats on the purchase, and welcome to the club! Good to see more and more users/owners of AMD cards!


----------



## Fatal Fighter (Dec 5, 2021)

GamerGuy said:


> @Fatal Fighter
> 'Grats on the purchase, and welcome to the club! Good to see more and more users/owners of AMD cards!


 Thanks mate! I feel like I am going to stick around


----------



## Butanding1987 (Dec 5, 2021)

Fatal Fighter said:


> Thanks mate! I feel like I am going to stick around


Now, let's start overclocking that baby.


----------



## Fatal Fighter (Dec 5, 2021)

Butanding1987 said:


> Now, let's start overclocking that baby.


Oh, sure thing. Waiting for the weather to get colder.


----------



## ratirt (Dec 6, 2021)

Fatal Fighter said:


> I still can't believe that the EVGA 850 GA Gold was not enough, or had, I don't know, some kind of incompatibility? lol. It was a brand-new PSU, and the RX 6800 was running without any issues at all.
> I guess it could not handle the spikes, but on the other hand, I have seen builds with 750W PSUs with no issues whatsoever.
> 
> Thank you for the kind words!


Are you serious? I've got a Red Devil 6900 XT with a Seasonic 750W, and it is rock solid and stable with the card. No issues whatsoever.
Grats on the card. How much did you pay for it, if you can say? I just want to compare with what I had to throw at mine in February.


----------



## Fatal Fighter (Dec 6, 2021)

ratirt said:


> Are you serious? I've got a Red Devil 6900 XT with a Seasonic 750W, and it is rock solid and stable with the card. No issues whatsoever.
> Grats on the card. How much did you pay for it, if you can say? I just want to compare with what I had to throw at mine in February.


Trust me, I am as surprised as you are, and that's why the main suspect was the GPU itself for a long time, until I plugged another PSU in just for the card and it worked, even OC'd.
I sold my RX 6800 for roughly $1400 and bought this one for ~$2000.


----------



## mama (Dec 6, 2021)

Valantar said:


> Wow, what PSU was that? My 6900 XT Liquid Devil Ultimate has been 100% stable on my Corsair SF750 - and that's an SFX PSU, so it has smaller bulk caps than most ATX PSUs and should thus be at least somewhat worse at handling transient load spikes. Though I've heard of a lot of odd behaviour with newer GPUs from both brands.


Mine is happy with 850W.


----------



## ratirt (Dec 6, 2021)

Fatal Fighter said:


> Trust me, I am as surprised as you are, and that's why the main suspect was the GPU itself for a long time, until I plugged another PSU in just for the card and it worked, even OC'd.
> I sold my RX 6800 for roughly $1400 and bought this one for ~$2000.


I had a similar problem with my Vega 64 and the Corsair AXi 760W. I bought a 5600 XT (the Vega got damaged) and also had a problem with the new card. I switched the PSU to a Seasonic and all the problems went away.


----------



## mama (Dec 6, 2021)

Valantar said:


> Yeah, I've seen that before. I don't think the CPU has much of an effect though, at least on the AMD side - the 5800X, 5900X and 5950X have the exact same power limits after all, and the higher-end chips are typically better binned, meaning they consume less power per core and thread under the same load (except in very lightly threaded workloads, where the 5900X and 5950X consume 2-3W more per core due to their higher clocks). And CPUs don't cause that many load spikes. One difference might be that the faster CPU is letting the GPU run more freely, causing it to spike more dramatically, though I wouldn't expect that to be true in all workloads. I'm quite surprised by this, tbh.


And I thought the 6900 XT had the same 300W power profile as the 6800 XT...


----------



## lowrider_05 (Dec 6, 2021)

mama said:


> And I thought the 6900 XT had the same 300W power profile as the 6800 XT...


For the AMD reference design, yes, this is correct, but OEMs can and usually will set higher power limits than the reference design.


----------



## Valantar (Dec 6, 2021)

lowrider_05 said:


> For the AMD reference design, yes, this is correct, but OEMs can and usually will set higher power limits than the reference design.


This, plus more CUs adjusting their clocks on the fly = chance of higher transients.


----------



## Fatal Fighter (Dec 6, 2021)

ratirt said:


> I had a similar problem with my Vega 64 and the Corsair AXi 760W. I bought a 5600 XT (the Vega got damaged) and also had a problem with the new card. I switched the PSU to a Seasonic and all the problems went away.


I was thinking maybe it was a faulty PSU, but damn it, the OC'd RX 6800 and 5950X had no issues at all. I don't know if there is such a thing as PSU incompatibility, but the only logical answer was power spikes, which I actually found out about here


http://imgur.com/BrGtiyl


----------



## Butanding1987 (Dec 8, 2021)

I started having problems with my 850W "80 Plus Gold" InWin PSU in less than a year. Suddenly, my PC would reboot once the GPU clock hit 2,600MHz. I am running an overclocked RX 6800 XT. I have since replaced it with a 1300W Platinum Seasonic and the problems went away. Now I can overclock this thing to death again. 

First time hitting first place in Port Royal and Fire Strike for the 6800 XT. I have a feeling it won't last long, at least for Port Royal.








I scored 11 556 in Port Royal: AMD Ryzen 9 5950X, AMD Radeon RX 6800 XT x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)

I scored 49 611 in Fire Strike: AMD Ryzen 9 5950X, AMD Radeon RX 6800 XT x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)

I have been out of the Hall of Fame Top 10 since the 12900k owners started breaking records. Now, I need 401 points to hit the Hall of Fame Top 10 again. 








3DMark Fire Strike Hall of Fame (www.3dmark.com)


----------



## Fatal Fighter (Dec 8, 2021)

Great CPU score! I saw Buildzoid getting the same, but he was doing it at like 4825/4775 MHz per CCD. Are you using an all-core OC too?
My 5950X is unable to run 1900/2000 FCLK, and it kinda sucks.


----------



## Felix123BU (Dec 8, 2021)

Fatal Fighter said:


> Great CPU score! I saw buildzoid getting same, but he was doing it like 4825/4775mhz per CCDs. Are you using all core OC too or?
> My 5950x unable to run 1900/2000FLCK and it kinda sucks


My 5800X runs 1900 FCLK 24/7, but 2000 is a no-go, at least not with the latest BIOS, though each BIOS has let me go a bit higher. I hope the next one gets me close to 2000; the RAM sticks can take it.
In Fire Strike, getting the RAM as fast as possible raises my scores incrementally with my 6800 XT. At least for Fire Strike, RAM speed helped me more than a GPU OC.


----------



## Fatal Fighter (Dec 8, 2021)

Felix123BU said:


> My 5800x runs 1900FLCK 24/7, but 2000 no way, at least not with the last bios, though each bios allowed me do go a bit higher, hope the next one gets me close to 2000, the Ram sticks can take it.
> In Fire Strike getting the Ram as fast all possible gets higher scores incrementally with my 6800xt. At least for Fire Strike, Ram speed helped me more than Gpu OC.


My 5800X was running 2000 FCLK with no problems. IDK, 2000 just feels so satisfying; not sure how much of a difference it makes for benchmarks.


----------



## Butanding1987 (Dec 9, 2021)

I'm running my 32GB of dual-channel RAM at 3733 MHz at an ambient temperature of ~33°C. The PC won't POST at 3800 MHz unless the air-con is on, but it POSTs at higher speeds (3833, 3866 and 4000), albeit with WHEA errors, even if the ambient temp is high. For benchmarks, I always turn the AC on. I use Hydra to overclock the 5950X. The settings are different for each 3DMark benchmark; Time Spy and Time Spy Extreme are the most demanding.



Fatal Fighter said:


> My 5800x was running 2000FLCK no problems, idk 2000 feels so satisfying  not sure how much of a difference it makes for benchmarks


Make sure you run benchmarks to check whether your points have actually gone up. In my case, the sweet spot is 1900 FCLK; anything higher leads to lower 3DMark/Cinebench scores.

I made it.   








3DMark Fire Strike Hall of Fame (www.3dmark.com)


----------



## Fatal Fighter (Dec 9, 2021)

Butanding1987 said:


> I'm running my 32GB dual channel RAM at 3733Mhz under an ambient temperature of ~33C. The PC won't POST at 3800Mhz unless the air-con is on, but POSTs at higher FCLK (3833, 3866 and 4k) and with WHEAs even if the ambient temp is high. For benchmarks, I always turn the AC on. I use Hydra to overclock the 5950x. The settings are different for each 3dmark benchmark. Time Spy and Time Spy Extreme are the most demanding.
> 
> 
> Make sure you run benchmarks to check if your points have actually gone up. In my case, the sweet spot is 1900 FCLK. Anything higher than that leads to lower 3dmark/Cinebench points.
> ...


My 5800X was doing great with FCLK 2000; the 5900X was laggy as hell and its scores were almost halved. My 5950X's sweet spot is 1800, though; it's doing kinda fine, but 1900/2000 would be cool.
Grats on top 9! Hopefully I'll join you in the top 10 soon.


----------



## Marc88 (Dec 9, 2021)

I need some advice on choosing: I've been offered two models of the 6900 XT with a water block. The models are the following:

PowerColor Liquid Devil AMD Radeon RX 6900 XT Ultimate

Gigabyte AORUS Radeon RX 6900 XT XTREME WATERFORCE WB

Thanks


----------



## Fatal Fighter (Dec 9, 2021)

Marc88 said:


> I need some advice on choosing, they offered me 2 models of 6900xt with waterblock and, the models are the following:
> 
> PowerColor Liquid Devil AMD Radeon RX 6900 XT Ultimate
> 
> ...


Owner of the latter here. Great card, and I can't really advise you to choose one over the other; both should be about the same overall, so it comes down to looks now. If there's a clearance issue, maybe go with the PowerColor; it should be shorter, as the Gigabyte is 28.5 cm long.


----------



## Valantar (Dec 9, 2021)

Marc88 said:


> I need some advice on choosing, they offered me 2 models of 6900xt with waterblock and, the models are the following:
> 
> PowerColor Liquid Devil AMD Radeon RX 6900 XT Ultimate
> 
> ...





Fatal Fighter said:


> Owner of the latter here. Great card and I cant really advise you to choose one over another, both should be kinda the same overall, its about looks now. Maybe if there is clearance issue go with Powercolor, it should be shorter, gigabyte is 28,5cm long


Heh, I was about to write the same, only I have the former. Haven't had the privilege of making a comparison, so ... I would assume both are good. I've heard people say PC cards often have poor cooler mounting pressure and thus high deltas between edge and hotspot temperatures - but I don't know if that is limited to air or not. Mine maxes out at a ~20-degree delta, and that's only when I let it run _hot_. That's not _great_ for a water cooled card, but definitely nowhere near problematic in any way, nor anything that's likely to affect clocks, longevity, or anything like that.


----------



## Marc88 (Dec 9, 2021)

Fatal Fighter said:


> Owner of the latter here. Great card and I cant really advise you to choose one over another, both should be kinda the same overall, its about looks now. Maybe if there is clearance issue go with Powercolor, it should be shorter, gigabyte is 28,5cm long





Valantar said:


> Heh, I was about to write the same, only I have the former. Haven't had the privilege of making a comparison, so ... I would assume both are good. I've heard people say PC cards often have poor cooler mounting pressure and thus high deltas between edge and hotspot temperatures - but I don't know if that is limited to air or not. Mine maxes out at a ~20-degree delta, and that's only when I let it run _hot_. That's not _great_ for a water cooled card, but definitely nowhere near problematic in any way, nor anything that's likely to affect clocks, longevity, or anything like that.


Thank you, bro.
Is there a lot of difference in performance between these models and the base 6900 XT? I'm also thinking about getting the base model and a separate water block. What do you think?


----------



## Butanding1987 (Dec 10, 2021)

There's not much difference between these cards in terms of performance. It doesn't really matter if you're just gaming. But if you're after benchmark world records, then you want the best binned card.


----------



## Fatal Fighter (Dec 10, 2021)

Marc88 said:


> thank you bro.
> There is a lot of difference in performance between these models and the 6900 base, I'm also thinking about taking the base and taking a separate wb, what do you think?


What the post above says. If you are after gaming only and would like to experience what installing a water block feels like, then sure, go for the second option.


----------



## Fatal Fighter (Dec 12, 2021)

I scored 23 487 in Time Spy (AMD Ryzen 9 5950X, AMD Radeon RX 6900 XT, 32768 MB, 64-bit Windows 10) - www.3dmark.com



A small OC session I've done today.
The score seems low, IDK; maybe I should apply 1.3 V+ on the GPU? Kinda scary.


----------



## lunatik (Dec 15, 2021)

Anyone have any experience with Thermal Grizzly Conductonaut (liquid metal) as thermal paste?

After one month of using Arctic MX-4 paste on my 6700 XT, it pretty much stopped working for me a few days ago.

I'm back to a 30-40°C delta between edge and junction temps (it was 20-24°C with the MX-4) with my "Warzone OC" of 2774-2874 MHz min/max, 1140 mV, 2150 MHz memory set in Adrenalin.

The temp isn't horrible, though; it's just 55-60°C edge / 90-95°C junction max, usually near 90.

The 5600X still runs fine (both CPU and GPU thermal paste were changed at the same time) at 4.65 GHz all-core with gaming temps of ~55°C max.

I'm just wondering what paste to try while keeping the same daily OC.


----------



## Garlic (Dec 17, 2021)

I now hold the 3DMark world record for the 12900K/KF + 6700 XT combo. I guess not many people pair a 12900K with a 6700 XT, lol.
AMD Radeon RX 6700 XT video card benchmark result - Intel Core i9-12900KF Processor,ASUSTeK COMPUTER INC. ROG STRIX Z690-A GAMING WIFI D4 (3dmark.com)


----------



## Nuckles56 (Dec 18, 2021)

I'm now a proud member of the RX 6000 club, as I picked up this guy to replace my 1080 Ti. It's a big card, alright.


----------



## nolive721 (Dec 18, 2021)

Dear lord, forgive me, for I have sinned.

I loved my Hellhound 6700 XT, but it was kind of overkill in my gaming rig. Someone at work offered the equivalent of $900 for the card, where I had paid only $700 back in September, and I accepted the deal.

Obviously I couldn't leave my rig cardless, so I decided to go the lower-end route and bought a Nitro+ 6600 XT as a replacement. I know it sounds crazy, but the card is amazing: it can drive my triple-1080p setup nicely above 60 fps even in AAA titles, and also my 720p plasma TV with VSR up to 1440p, also at a 60 fps minimum.

And with Amazon's current campaign over here in Japan, I could get it for $550, so not too bad a deal overall.

At some point, when this GPU market comes back to its senses, I will buy a higher-tier card again for my gaming rig and move the 6600 XT to the living room rig instead.

Again, don't get me wrong, the 6700 XT was a great card, but I came to the conclusion that I didn't really need it in the end, hence the move.


----------



## nolive721 (Dec 19, 2021)

Thanks to the mods, then, for bringing my post back.

I spent some of my afternoon doing heavy gaming, comparing the power draw of the 6600 XT vs my 6700 XT.

Same game conditions (a bunch of sim-racing titles like AMS2, PC2 and ACC), focusing on OC/UV settings to minimize power consumption:

Triple 1080p with FreeSync @75 Hz: max draw 6700 XT 120 W, 6600 XT 135 W
720p plasma upscaled to 1440p @60 Hz: max draw 6700 XT 100 W, 6600 XT 80 W

What's interesting is that at 1440p the 6600 XT is even more efficient! But then I need to crank the core/memory clocks up a bit on my triple-monitor setup to achieve the same fps as the 6700 XT.

But yes, overall, even if it's a bit embarrassing to be in a lower-tier GPU crew now, I don't regret my choice. I might even put the savings toward a proper UW 1440p display if I see a nice deal in the New Year sales.


----------



## GamerGuy (Dec 19, 2021)

Doesn't matter, the 6600 XT is still an RDNA2 GPU and a most welcome addition to the club!   Just noticed my previous post had vanished... whaddupwiddat?


----------



## QuietBob (Dec 19, 2021)

nolive721 said:


> I have spent some of my afternoon doing some heavy gaming, comparing the power draw of the 6600XT vs my 6700XT.
> 
> Same Game conditions (bunch of Simracing titles like AMS2, PC2 and ACC) focusing on OC/UV settings to minimize power consumption
> 
> ...



Good purchase with the 6600 XT - and thanks for these numbers!

Mid-range RDNA2 cards are amazingly efficient. I have a 6600 XT from ASRock, the Phantom Gaming D model, running stock clocks and voltage. I haven't tested it in the latest games, as I don't play them, but I do have some numbers for power consumption. I'm also running a 3300X OC'd to 4.5 GHz on all cores, powered by a Platinum-rated PSU.

At idle, the GPU uses a mere *3 W*. The whole system consumes just *38 W*, as measured at the wall. I ran the Forza Horizon 4 benchmark at 1080p with V-sync on and all detail settings maxed out manually. On average, the whole system was consuming *110 W* at a locked 60 fps.

As I don't buy into the "144+ fps or bust" craze and play mostly old titles, the 6600 XT is more than enough for my gaming needs.


----------



## nolive721 (Dec 20, 2021)

Thanks

I also bought the Nitro+ because it has dual BIOS, even adjustable from inside Windows through their TriXX software.

It's risky, though, so I will possibly do the switch while the PC is off.

I am planning to play with MPT if I see the card struggling once I buy the UW monitor and it has to push fps at an even higher resolution.


----------



## Felix123BU (Dec 21, 2021)

nolive721 said:


> thanks to the Mods then for bringing my post back
> 
> I have spent some of  my afternoon to do some Heavy gaming comparing the power draw of the 6600XT vs my 6700XT.
> 
> ...


Yeah, these RDNA2 chips are insanely power efficient when tuned a bit for efficiency. I settled on 60 fps @3440x1440, and the power consumption is between 100 and 110 W for normal games, and between 120 and 150 W for the most demanding ones, with my 6800 XT.

I mainly did this to see how much I could lower my electricity bill, and I was quite amazed by the result: the bill for the last two months was 30% lower on average, with everything else more or less the same. OK, I tuned the whole system for low power, but the results were a bit mind-blowing.   I still keep thinking about the mantra many repeat when talking about PC power consumption and electricity bills, that the difference between a low-power part and a high-power one is negligible on the power bill; my findings strongly disprove it.
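Back-of-the-envelope, the bill impact of that kind of tuning is easy to estimate. A minimal sketch; every number in it (wattages, hours per day, price per kWh) is a hypothetical assumption, not the actual figures from the post:

```python
# Rough sketch of the electricity-bill math: energy = power x time.
# All inputs below are made-up assumptions for illustration only.

def monthly_cost(avg_watts: float, hours_per_day: float,
                 price_per_kwh: float = 0.30) -> float:
    """Cost of one 30-day month of use at a given average draw."""
    kwh = avg_watts / 1000 * hours_per_day * 30
    return kwh * price_per_kwh

# Untuned vs. efficiency-tuned GPU draw while gaming (assumed values):
untuned = monthly_cost(avg_watts=300, hours_per_day=4)
tuned = monthly_cost(avg_watts=130, hours_per_day=4)
print(f"untuned: {untuned:.2f}/month, tuned: {tuned:.2f}/month, "
      f"saved: {untuned - tuned:.2f}/month")
```

With these assumed numbers the tuned profile saves a bit over half the GPU's share of the bill, which is at least in the same ballpark as the 30% whole-bill drop reported above.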


----------



## nolive721 (Dec 21, 2021)

Yes, I can surely second that regarding power consumption. My PC in this gaming room runs a two-display setup: triple 1080p screens, whose max power draw is only 100 W, and then the first plasma TV I bought when I arrived in Japan 15 years ago... that one consumes 450 W, no kidding.  
So the savings I made by selling the 6700 XT and running my rig with the 6600 XT are going toward a 3440x1440 monitor that will draw at most 40 W. Win-win.


----------



## GamerGuy (Dec 22, 2021)

Waiting for my RX Vega 64 Red Devil to bite it; then I'd be looking at perhaps the RX 6600 XT or RX 6700 XT. My Vega 64 struggles with newer games at 3440x1440. I don't need a crazy-high framerate, as the monitor is an old Acer 75 Hz FreeSync model.


----------



## nolive721 (Dec 22, 2021)

Boy, you are going to love the power consumption vs the Vega.

I had a 64 LC, and pushing the boundaries could take me to 300 W.

Even with some intense UV, maintaining 5760x1080 at FreeSync 75 Hz, it was nowhere near 200 W.


----------



## Felix123BU (Dec 22, 2021)

GamerGuy said:


> Waiting for my RX VEGA64 Red Devil to bite it, then I'd be looking at perhaps the RX 6600 XT or RX 6700 XT, my VEGA64 struggles with newer games at 3440x1440, don't need crazy high framerate as monitor's an old Acer 75Hz Freesync model.


Came from a Vega 64, also at 3440x1440; the Vega was barely keeping up at that resolution. I could get by, but I wanted a new card. My 6800 XT is basically overkill right now: more than twice as fast while consuming less than half the power. I've said it before, I am super impressed with the 6800 XT; the last time I was this impressed by a card was the HD 4870, 14 years ago.   
The Vega is humming along just fine in a friend's PC.


----------



## QuietBob (Dec 22, 2021)

I built a new system with the 6600 XT and kept the old PC (in my specs). The OC'd HD 7970 in there consumes *19 W* at idle, but in games it could go *over 300 W*. The whole FX platform took close to *450 W* in some titles.

My RDNA2 rig will use perhaps a quarter of that power while providing a much better gaming experience. Loving the efficiency!


----------



## GamerGuy (Dec 23, 2021)

I don't use my Vega much, as I do my gaming on my main rig; the Vega 64 + i7-3960X rig is my backup should there be an issue with the main one. Here's hoping the Vega 64 lasts a fair bit longer, as GPU prices, especially in my neck of the woods, can be mind-numbing. For example, the 'cheapest' RX 6700 XT I can find on the local Amazon listing is ~$1,084 for a reference model. I find it ridiculous that some RX 6600 XTs are being sold for more than the reference RX 6700 XT. Hopefully this GPU pricing madness ends before my Vega 64 does...


----------



## QuietBob (Dec 26, 2021)

Did some more testing on my main and secondary PCs. I looked at total power consumption in Mass Effect: Andromeda, checking the same scene with the exact same settings. The FX-8300/HD 7970 rig was consuming *410 W* in game and peaked at *492 W* during shader generation on first launch. The 3300X/6600 XT combo was at *109 W* and *151 W*, respectively. 1080p with V-sync @ 60.

So far, my estimates are checking out: almost 4x less energy used when gaming. For Earth!
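The "almost 4x" figure follows directly from the wall-power readings quoted above; a quick throwaway check:

```python
# Wall-power readings quoted in the post, in watts.
old_rig = {"in game": 410, "shader generation peak": 492}  # FX8300 + HD7970
new_rig = {"in game": 109, "shader generation peak": 151}  # 3300X + 6600XT

# Ratio of old draw to new draw for each scenario.
for scenario, old_watts in old_rig.items():
    ratio = old_watts / new_rig[scenario]
    print(f"{scenario}: {ratio:.2f}x less power")  # ~3.76x and ~3.26x
```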


----------



## Felix123BU (Dec 27, 2021)

nolive721 said:


> thanks to the Mods then for bringing my post back
> 
> I have spent some of  my afternoon to do some Heavy gaming comparing the power draw of the 6600XT vs my 6700XT.
> 
> ...


There should be no embarrassment in having any card, high or low tier; it's not like having a higher-tier card makes one's junk longer.


----------



## Fatal Fighter (Dec 27, 2021)

If temps are fine (junction was 85-87°C max), what could cause a PC shutdown during Time Spy GT2? The GPU was consuming 520 W+; I tried to run the same settings twice, same result. The PSU is a be quiet! 1000 W Platinum.


----------



## Felix123BU (Dec 27, 2021)

Fatal Fighter said:


> If temps are fine (junction was 85-87C max), what could cause PC shutdwon during Time Spy GT2? GPU was consuming 520W+, tried to run same setting two times, same. PSU is be! quiet 1000W Platinum


On first thought, some sort of overcurrent protection on the GPU, or even OCP on the PSU; you could have hit some safety limits, and GT2 is the most power-intensive part of that test. 520+ W sustained is quite a lot.


----------



## Bomby569 (Dec 27, 2021)

Fatal Fighter said:


> If temps are fine (junction was 85-87C max), what could cause PC shutdwon during Time Spy GT2? GPU was consuming 520W+, tried to run same setting two times, same. PSU is be! quiet 1000W Platinum



You could look at the error code in Windows. Better than guessing. Don't just assume it's the GPU; it could be a memory OC or a million other things. Assuming stuff with PCs has never helped me in the past.


----------



## Kabouter Plop (Dec 28, 2021)

Has anyone else had this kind of texture popping and flickering? It happens exclusively in Doom Eternal. Reloading the level shows it consistently; restarting the game when it doesn't show it consistently shows no flickering.








I've had these kinds of issues with other games before, where others reported similar problems. I'm getting a bit fed up with these issues and with bug reports to AMD going nowhere.


----------



## nolive721 (Dec 30, 2021)

Did it. I used another Amazon Japan campaign (you have to love being a Prime user...) and bought this one: https://www.amazon.co.jp/gp/product/B09DSY4VCK/ref=ppx_yo_dt_b_asin_title_o02_s00?ie=UTF8&psc=1

It's really good at improving immersion vs the plasma TV, and 40 W of power consumption vs 450 W will be the cherry on the cake.

It's 3440x1440, and the 6600 XT handles games very well on it. I have managed to run my sim-racing games at 60 fps with max graphics settings, including Assetto Corsa with the SOL mod, and this while UVing the card to 2200 MHz / 1.075 V, so temps are in the low 50s and it's whisper quiet.

I pushed the envelope to take advantage of the monitor's high refresh rate, and it could pull close to 90 fps in some cases, but of course with higher temps and noise. I might do this if I start playing FPS games again.

Just a finding: the card is a Nitro+, by the way. With my PowerColor 6700 XT I could undervolt to 1.02 V at a similar core frequency, so that's the only thing really missing with the 6600 XT, the better UVing potential, but all the rest largely counterbalances this.

So I'm a happy bunny right now.


----------



## Felix123BU (Dec 30, 2021)

nolive721 said:


> did it. used another Amazon Japan campaign (you have to love being prime user...) and bought this one https://www.amazon.co.jp/gp/product/B09DSY4VCK/ref=ppx_yo_dt_b_asin_title_o02_s00?ie=UTF8&psc=1
> 
> its really good at improving immersion vs the Plasma TV, and 40W of electric consumption vs the 450W will make this being the cherry on the cake.
> 
> ...


Welcome to the Ultrawide world, you are going to love it


----------



## Butanding1987 (Dec 30, 2021)

Fatal Fighter said:


> If temps are fine (junction was 85-87C max), what could cause PC shutdwon during Time Spy GT2? GPU was consuming 520W+, tried to run same setting two times, same. PSU is be! quiet 1000W Platinum


Most probably a power supply issue. I have hit >700 W on my 6800 XT with no problem in Time Spy Extreme. I'm using a Seasonic 1300 W PSU.


----------



## Kabouter Plop (Jan 1, 2022)

Kabouter Plop said:


> Anyone else had this kind of texture popping and flickering exclusively happens in doom eternal reloading level shows it consistently restarting the game when it does not show it consistently shows no flickering.
> 
> 
> 
> ...



It seems setting decal quality from Ultra Nightmare to Nightmare instantly resolves it, but it only appears 1 out of 10 times you start up Doom Eternal on those specific levels, and if it does appear, only a restart fixes it, not a level reload or setting decals to Nightmare.


----------



## Fatal Fighter (Jan 2, 2022)

Butanding1987 said:


> Most probably a power supply issue. I have hit >700w on my 6800 XT with no problem in Time Spy Extreme. I'm using a Seasonic 1300w PSU.


Well, when an 850 W PSU is not enough for this rig to run at stock settings, I tend to agree with you!


----------



## Felix123BU (Jan 2, 2022)

Fatal Fighter said:


> Well, where 850PSU is not enough for this rig to run at stock settings, I tend to agree with you!


Generally speaking, a 750 W unit is enough if it's good quality and you don't go crazy with OCs. I ran my 6800 XT on a Corsair CX750 (that's an average PSU) for 4 months and had zero issues, even when doing some temporary high OCs. Many PSUs do not handle spikes very well.


----------



## Fatal Fighter (Jan 3, 2022)

Felix123BU said:


> Generally speaking a 750w is enough if its good quality


The EVGA 850 GA Gold is at least above average, I would say, but as I found out, power spikes caused the system to shut down. I was not able to run Time Spy or Fire Strike at stock settings. I had no issues before that with the RX 6800 + 5950X OC'd. I was very surprised to find out the PSU isn't enough even at stock settings.

A small off-topic question: is it even possible to reach a 19K+ CPU score in Time Spy with a 5950X at FCLK 1800? Every 19K+ result I've seen was set at FCLK 1900+, and unfortunately my CPU is not capable of that.


----------



## Butanding1987 (Jan 3, 2022)

Fatal Fighter said:


> EVGA 850GA Gold is at least above average I would say, but as I found out power spikes caused system to shutdown. I was not able to run Time Spy or FireStrike at stock settings. Had no issues before that with RX6800+5950X OC'ed. I was very surprised to find out PSU is not enough even on stock settings.
> 
> Small off-topic: Is it even possible to reach 19K+ CPU score in Time Spy with 5950X/FCLK 1800? Every result I saw which is 19K+ was set on FCLK 1900+ and unfortunately my CPU is not capable of that


You might have the 1900 FCLK hole. Try higher than 1900; it should boot, but check for WHEAs.


----------



## Fatal Fighter (Jan 3, 2022)

Butanding1987 said:


> You might have the 1900 Fclk hole. Try higher than 1900 -- it should boot. But check for WHEAs.


Thanks, I've tried that; 2000 boots but is very unstable, with lots of WHEA errors and such.


----------



## Torcher01 (Jan 3, 2022)

Hello everyone, nice to be here.

I downloaded the program earlier for my PowerColor 6900 XTU Liquid Devil Ultimate. I modded the three numbers per the instructions to move the power-consumption slider scale. At first I was very pleased: my max went from around 385 W to 440 W at the 15% AMD driver limit. However, I need just a little bit more power. I tried going right back in and increasing the wattage, but the numbers did not change at all. Those numbers reflect a 45% increase over standard. I don't know if there is a process to overwrite; I did hit delete SPPT / write SPPT. I have massive cooling, so I'm not concerned at all about overheating. Did I hit the limit of the SPPT? Thanks!


----------



## Butanding1987 (Jan 4, 2022)

Torcher01 said:


> Hello everyone, nice to be here.
> 
> I downloaded the program earlier for my Powercolor 6900XTU liquid Devil  Ultimate. I modded the 3 numbers per the instructions, to move the slider scale of power consumption. At first I was very pleased my max went from around 385-440W at the 15% AMD driver limit. However, I need just a little bit more power. I tried going right back in and increasing the wattage, but the numbers did not change at all. Those numbers reflect a 45% increase over standard. I don't know if there is a process to overwrite,I did hit delete sppt/write SPPT. I have masssive cooling so I'm not concerned at all with overheating. Did I hit the limit of SPPT? Thanks!


Watts = volts x amperes. That means you have to increase either the voltage or the amperage, or both, to get more power.
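In other words, the power ceiling is just the product of the rail's voltage and current limits. A tiny illustration with made-up rail numbers (these are not Torcher01's actual MPT values):

```python
# P = V * I: the power ceiling follows from the voltage and current limits.
def power_watts(volts: float, amps: float) -> float:
    return volts * amps

# Hypothetical GPU core-rail limits:
print(f"{power_watts(1.15, 380):.0f} W")  # ~437 W ceiling
# Raising the current (TDC) limit raises the power ceiling too:
print(f"{power_watts(1.15, 480):.0f} W")  # ~552 W ceiling
```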


----------



## Torcher01 (Jan 4, 2022)

If the bottom two are in amperes and the top is in watts, what are you getting at? If you're multiplying two larger numbers, they will increase the end result; by definition, more volts and/or amperes means more watts. Here are the original values.

OK, I found it. In MPT, on the first tab, hit 'load the standard BIOS', then select your GPU from the drop-down. This selects where to write to and reports 'successfully added' once written. I'm now getting around 550 W peak. Very nice!


----------



## turbogear (Jan 6, 2022)

Hi RDNA2 owners.  
It's been a while since I was here.
I wish you all a Happy New Year.

I started a project in September that I never completed, due to a very sad situation in the family.
I was rebuilding my PC with an external Mo-Ra radiator and changing from a 6900XT Liquid Devil Ultimate to a liquid-cooled 6900XT OC Formula.
This has now been completed over Christmas.  

Here is the beast during a benchmarking run, with the Mo-Ra sitting outside on the balcony, where it was 0°C. 



Now for the big, interesting stuff.

Look at the score in Time Spy.









I scored 26 340 in Time Spy (AMD Ryzen 9 5950X, AMD Radeon RX 6900 XT, 32768 MB, 64-bit Windows 10) - www.3dmark.com




First place worldwide: 









3DMark.com search (www.3dmark.com)
				








I can't share the settings, as I got help from Devcom at the Hardwareluxx forum to achieve this 1st place worldwide.

So the settings are Devcom's, not mine to share.


----------



## nolive721 (Jan 6, 2022)

Respect. AMD FTW!


----------



## Felix123BU (Jan 7, 2022)

turbogear said:


> Hi RDNA2 owners.
> It's been a while since I was here.
> I wish you all Happy New year.
> 
> ...


Great results, this should make some people green with envy


----------



## Felix123BU (Jan 23, 2022)

OK, I played some more with low-power tuning on my 6800 XT, and behold, it's now using 80 W at 3440x1440 @60 Hz.   
This is totally indecent for such a powerful GPU.


----------



## nolive721 (Jan 24, 2022)

Hey, what game is that?

I would like to test my 6600 XT on my UW 3440x1440 to see how much power is needed to achieve 60 fps.

For the sake of science, lol. Though I assume I'd have to buy the game to actually play it, haha.


----------



## Felix123BU (Jan 24, 2022)

nolive721 said:


> Hey what Game is that ?
> 
> I would like to test my 6600XT on my UW 3440/1440p to see how much power needed to achieve 60fps
> 
> for the sake of science lol) but assuming I would like to buy this Game to also play it ah ah


Skyrim Special Edition with a lot of graphics mods, making it relatively heavy on the GPU compared to what it originally is. 
The fact that I have a lot of mods would make a 100% equal comparison difficult. But for the sake of science, something relatively easy to compare is how much power is needed for, let's say, Cyberpunk maxed out at 3440x1440 @60fps: with the latest tuning I need ~135 W to keep it just above 60 fps. The only setting not maxed out is Screen Space Reflections (that's on High only), and all the blur and DOF crap is turned off. Everything else is as high as possible, no RT of any kind.


----------



## GamerGuy (Jan 24, 2022)

You know, I wonder if anyone's gonna get the RX 6500 XT and join this club. I hope they do, because I happen to think it's a great budget card (it can be had for $200 to $270 according to Newegg) that is able to do what it was intended to do (on a PCIe 4.0 mobo): play games at reasonable graphics settings (no HD texture packs, no 'Ultra'), a mix of Low to Mid to High settings depending on the game, and still net playable, smooth framerates at 1080p and no higher. I'd say the price is good, given these days of chip shortages and other factors, because at $200-270 it outperforms the GTX 1650 and is cheaper; heck, it's even cheaper than the crappy GTX 1050 Ti (Newegg, $300 and up).


----------



## LeonIncognito (Jan 24, 2022)

I have a 6600 XT, which I seriously enjoyed for the price at the time ($425) and with my 1080p monitor. It performed very well at that resolution: high or max settings at 120 fps in games from only a few years ago, and 60 fps at max settings in the newest games. However, AMD cards are permanently compromised in DirectX 9 games like the Far Cry and Assassin's Creed titles, and even A Hat in Time; framerates would be all over the place no matter the settings. AMD just made bad drivers and never addressed it.

Unfortunately, when I moved to Intel 12th and a new board, I had so many driver problems I decided to jump ship when I snagged a 3070 Ti at Best Buy.


----------



## nolive721 (Jan 26, 2022)

Hello everyone, 

Would another move, from my current 6600 XT to a 6600, be a wise thing to do?

I bought the XT a month ago for under $550, and I see a Sapphire Pulse 6600 for around $470 here where I live.

It looks like it's the same chip and memory bus; only the clocks and TDP are different. So I was wondering if, with a bit of MPT help and OCing, I could get such a 6600 very close to my 6600 XT.

I play on a UW 1440p FreeSync monitor, and my GPU is heavily UVed but still stays happily above the low end of my monitor's FreeSync range, so I don't mind if I get just 2 to 3 fps less with the 6600.

Thanks in advance for the feedback, including if you think I'm nuts, lol.


----------



## Space Lynx (Jan 26, 2022)

LeonIncognito said:


> I have a 6600XT which I seriously enjoyed for the price at the time ($425) and with my 1080p monitor. It performed very well at that resolution, high or max and 120fps on games only a few years ago. 60fps max settings with newest games. However, AMD cards are permanently compromised with DirectX 9 games like Far Crys, Assassin´s Creed and even Hat in Time. Framerates would be all over the place no matter the settings. AMD just made bad drivers and never addressed it.
> 
> Unfortunately, when I moved to Intel 12th and a new board, I had so many driver problems I decided to jump ship when I snagged a 3070 Ti at Best Buy.




Did you get the 3070 Ti in person at Best Buy?


----------



## LeonIncognito (Jan 27, 2022)

No, online. They were dropping several models spaced apart, and near the end I kept missing them all despite being in the queue. I got the 3070 Ti, though I really wanted a 3080, because 8GB already seems to not be enough for some games' texture settings.


----------



## nolive721 (Jan 29, 2022)

Is MPT working on RX GPUs if you are on the latest Radeon drivers? I received an RX 6600 today, and its power limit is locked at 120 W (where the TDP is meant to be 132 W, by the way); changing this in MPT, writing the SPPT table, and restarting my PC doesn't make any difference.

Any help appreciated.


----------



## Felix123BU (Jan 29, 2022)

nolive721 said:


> Is MPT still working on RX GPUs if you are on the latest Radeon drivers? I received an RX 6600 today and its power limit is locked at 120W (where the TDP is supposed to be 132W, by the way), and changing this in MPT, writing the SPPT table and restarting my PC doesn't make any difference.
> 
> Any help appreciated.


As far as I can see there is no mention of 6600 support on the MPT version page; could be wrong though. Even if it's not supported just now, it might be in the future or in the latest beta versions. I would check the latest beta, even though that too does not mention support for the 6600.

Anyway, love MPT, best tool for AMD cards ever


----------



## nolive721 (Jan 29, 2022)

Felix123BU said:


> As far as I can see there is no mention of 6600 support on the MPT version page; could be wrong though. Even if it's not supported just now, it might be in the future or in the latest beta versions. I would check the latest beta, even though that too does not mention support for the 6600.
> 
> Anyway, love MPT, best tool for AMD cards ever


Thanks for the heads-up; you are right, the stable version doesn't recognize the RX 6600 or 6600 XT. I tried the beta version on both, and you can override things like voltage and power in MPT, but the Radeon drivers still won't let you go beyond the 20% power increase.

I can see the voltage slider being updated (I had set it to move from 1150mV to 1175mV), but unfortunately it doesn't give the GPU any extra power headroom; again, unless I am not using it properly (I played with my 6700XT last year and it worked well).

But anyway, for the sake of science and of people who are weighing a 6600 against a 6600 XT, I did quite a few back-to-back comparisons, shown below in detail.
My conclusion is that the 6600 XT is the better card: it is much more efficient at reaching the same FPS in the benchmark and games tested (almost 50W!!!), and it has the extra performance I will need to drive my two different monitor setups in future AAA titles.

Heaven is run at 1080p, and all graphics are maxed out in the two games tested, including heavy modding in AC.

Detailed results here

*RX 6600 PowerColor Hellhound*

Heaven (1080p):
- Undervolt/OC, 2555MHz core @ 1.097V, memory 1750MHz (fast): 101.5 FPS, 110W
- Default (all stock): 98 FPS
- Overclock, 2730MHz core @ 1.15V, memory 1862MHz: 103.9 FPS, 120W

BENCHMARKS

Assetto, triple 1080p:
- UV: 73 FPS, CPU 70%, 108W
- Default: 71 FPS, CPU 68%, 105W
- OC: 75 FPS, CPU 72%, 129W


*RX 6600 XT Sapphire Nitro+*

Heaven (1080p):
- Undervolt, 2188MHz core @ 1.062V, memory 200MHz: 103 FPS, 75W
- Default (all stock): 116 FPS
- Overclock, 2965MHz core @ 1.15V, memory 2150MHz: 126 FPS, 148W

BENCHMARKS

Assetto, triple 1080p:
- UV: 75 FPS, CPU 63%, 75W
- Default: 84 FPS, CPU 70%
- OC: 90 FPS, CPU 74%, 144W

Shadow of the Tomb Raider, triple 1080p:
- UV: 43 FPS, 6730 frames
- Default: 48 FPS, 7516 frames
- OC: 49 FPS, 7628 frames

Assetto, UW 1440p:
- UV: 68 FPS, CPU 56%, 71W
- Default: 75 FPS, CPU 61%
- OC: 81 FPS, CPU 67%, 146W

Shadow of the Tomb Raider, UW 1440p:
- UV: 58 FPS, 9000 frames
- Default: 62 FPS, 9650 frames
- OC: 65 FPS, 10250 frames
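For anyone skimming the numbers above, here is a small Python sketch (my own addition, not the poster's) that turns the Heaven runs with a recorded wattage into an FPS-per-watt ranking, using the figures exactly as posted:

```python
# Perf-per-watt comparison from the Heaven 1080p runs posted above.
# Only runs where both FPS and board power were recorded are included.

results = {
    "RX 6600 UV/OC":  {"fps": 101.5, "watts": 110},
    "RX 6600 OC":     {"fps": 103.9, "watts": 120},
    "RX 6600 XT UV":  {"fps": 103.0, "watts": 75},
    "RX 6600 XT OC":  {"fps": 126.0, "watts": 148},
}

def efficiency(fps, watts):
    """FPS delivered per watt of reported GPU power."""
    return fps / watts

# Print the runs sorted from most to least efficient.
for name, r in sorted(results.items(), key=lambda kv: -efficiency(**kv[1])):
    print(f"{name:14s} {efficiency(**r):.2f} FPS/W")
```

The undervolted 6600 XT run comes out well ahead of every 6600 configuration here, which matches the poster's conclusion about efficiency.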


----------



## Fatal Fighter (Jan 30, 2022)

So, I could not help myself and purchased a second Gigabyte RX 6900 XT Xtreme Waterforce, as it was $1500 for a brand new one.
I always had concerns about my first Gigabyte RX 6900 XT's temperatures; I easily hit 50-55°C while playing RDR2, for example, which seemed kinda high to me.
Now, the card I purchased recently runs 12-13 degrees cooler, so should I disassemble the hot card, change the thermal paste and tighten it harder? I don't believe such a difference is just silicon lottery.
I barely ran Time Spy GT1 (a 1000W Platinum PSU is definitely not enough for these guys; upgrading to 1500W tomorrow), and power draw and frequencies are pretty much the same, but the 12-13°C difference is bugging me a lot.

Edit: actually it's even a 15°C difference


Spoiler: IMGs


----------



## Felix123BU (Jan 30, 2022)

Fatal Fighter said:


> So, I could not help myself and purchased a second Gigabyte RX 6900 XT Xtreme Waterforce, as it was $1500 for a brand new one.
> I always had concerns about my first Gigabyte RX 6900 XT's temperatures; I easily hit 50-55°C while playing RDR2, for example, which seemed kinda high to me.
> Now, the card I purchased recently runs 12-13 degrees cooler, so should I disassemble the hot card, change the thermal paste and tighten it harder? I don't believe such a difference is just silicon lottery.
> I barely ran Time Spy GT1 (a 1000W Platinum PSU is definitely not enough for these guys; upgrading to 1500W tomorrow), and power draw and frequencies are pretty much the same, but the 12-13°C difference is bugging me a lot.
> ...


Most likely it's mounting pressure or bad paste spread. You would get by far the best results with liquid metal on the die instead of paste, but that must be done with much care and needs some knowledge.
I would not feel too bad about 50-55°C in a demanding game like RDR2, but of course, less is better. If you disassemble it, it would be a good idea to have spare thermal pads of the correct thickness; they can get damaged during disassembly.

Btw, nice looking build


----------



## Fatal Fighter (Jan 30, 2022)

Felix123BU said:


> Most likely it's mounting pressure or bad paste spread. You would get by far the best results with liquid metal on the die instead of paste, but that must be done with much care and needs some knowledge.
> I would not feel too bad about 50-55°C in a demanding game like RDR2, but of course, less is better. If you disassemble it, it would be a good idea to have spare thermal pads of the correct thickness; they can get damaged during disassembly.
> 
> Btw, nice looking build


I was totally thinking about changing the thermal pads and even using LM, but I don't know what thickness of pads is needed, so I thought I might reuse the old ones; plus I'm kinda lazy to do it, or figure it won't have much effect. I have seen great results with LM and know how careful you need to be, for obvious reasons coming from its name.  I think I will use it sooner or later, just hesitating for now.

Thanks for the reply and liking my build!


----------



## nolive721 (Jan 31, 2022)

nolive721 said:


> Thanks for the heads-up; you are right, the stable version doesn't recognize the RX 6600 or 6600 XT. I tried the beta version on both, and you can override things like voltage and power in MPT, but the Radeon drivers still won't let you go beyond the 20% power increase.
> 
> I can see the voltage slider being updated (I had set it to move from 1150mV to 1175mV), but unfortunately it doesn't give the GPU any extra power headroom; again, unless I am not using it properly (I played with my 6700XT last year and it worked well).
> 
> ...


The RX 6600 is going back, and I will be keeping the 6600 XT until new and second-hand market prices come back to at least pre-COVID levels.

I am in the red camp, owning both an AMD CPU and GPU, but I am not sure how they can convince people with the 6600's current pricing when its power-efficiency-to-performance ratio is so far from the XT variant's.

Maybe they bank on the fact that Vega and RDNA1 are worse, but that's not very customer-friendly.

Anyway, end of rant. I'll keep my 6600 XT Nitro+ running in UV mode for some time and enjoy it before the next upgrade.

PS: Last night I was playing RF2, one of my sim-racing titles, in UW 1440p, no FSR, all graphics maxed out, and it was the first time I could see the VRAM hitting 8GB on a track. I will need to keep an eye on this.


----------



## Fatal Fighter (Feb 7, 2022)

I am confused af. Is a 1500W Titanium PSU not enough for two OC'd 6900XTs?? OC settings are 2600/2700 with +15% power limit. No issues on stock. The PC shuts down on GT2 in Time Spy every time. I can only suspect some weird power spikes, maybe, but the integrated monitoring widget never shows power usage higher than 400W per card.


----------



## GamerGuy (Feb 12, 2022)

Fatal Fighter said:


> I am confused af. *Is a 1500W Titanium PSU not enough for two OC'd 6900XTs??* OC settings are 2600/2700 with +15% power limit. No issues on stock. The PC shuts down on GT2 in Time Spy every time. I can only suspect some weird power spikes, maybe, but the integrated monitoring widget never shows power usage higher than 400W per card.


Why are you on a dual RX 6900 XT CF setup when CF (like SLI) is basically dead in the water? I used to be a CF/SLI man myself, having owned dual-GPU setups from time to time; my last AMD CF cards were a pair of GB RX Vega64 Gaming OC, plus a pair of GTX Titans. These were powered by either my Seasonic X-1250 or my Enermax MAX REVO 1500W PSU.

With regard to your present conundrum, it may be so. The reason I say this is that even with a single, mildly OC'ed Nitro+ RX 6900 XT, my system would occasionally restart (very seldom, which makes it hard to troubleshoot, as it would not repeat once I got back to the desktop). The funny thing is, I can game all day, or at least 2-3 hours, and have absolutely no issues. But it's entirely possible that my rig will restart simply while I'm on the desktop the very next time around. It never happened with the single PC Vega64 Red Devil (before upgrading to my present card) but occasionally does now. It's so frustrating, but it happens so randomly, and so very seldom, that I've given up trying to troubleshoot it.

BTW, I have a Corsair HX1000 Platinum, which by rights should be more than sufficient for any single flagship GPU today. Oh yeah, are you running both cards OC'ed? If so, why not try running them at default settings first to see if they're stable to begin with?


----------



## Fatal Fighter (Feb 12, 2022)

GamerGuy said:


> Why are you on a dual RX 6900 XT CF setup when CF (like SLI) is basically dead in the water? I used to be a CF/SLI man myself, having owned dual-GPU setups from time to time; my last AMD CF cards were a pair of GB RX Vega64 Gaming OC, plus a pair of GTX Titans. These were powered by either my Seasonic X-1250 or my Enermax MAX REVO 1500W PSU.
> 
> With regard to your present conundrum, it may be so. The reason I say this is that even with a single, mildly OC'ed Nitro+ RX 6900 XT, my system would occasionally restart (very seldom, which makes it hard to troubleshoot, as it would not repeat once I got back to the desktop). The funny thing is, I can game all day, or at least 2-3 hours, and have absolutely no issues. But it's entirely possible that my rig will restart simply while I'm on the desktop the very next time around. It never happened with the single PC Vega64 Red Devil (before upgrading to my present card) but occasionally does now. It's so frustrating, but it happens so randomly, and so very seldom, that I've given up trying to troubleshoot it.
> 
> BTW, I have a Corsair HX1000 Platinum, which by rights should be more than sufficient for any single flagship GPU today. Oh yeah, are you running both cards OC'ed? If so, why not try running them at default settings first to see if they're stable to begin with?


I know mGPU is not a thing anymore, unfortunately (it never had been "a big deal", in fact). I just could not resist buying a second one, as it was $1550 and I am kind of a PC/upgrade addict.  There is no issue on stock whatsoever, but there was with the 1000W PSU, which was not enough for these hungry boys.
I am an OC enthusiast and wanted to play around with these cards in some benchmarks, but I haven't tried anything yet other than Time Spy, which just shuts my PC down at 2600/2700. One card is 15°C warmer than the other; I really don't think this could somehow be connected with my issue, but I am going to reapply the thermal paste anyway when the adjustable fitting arrives and see. If HWiNFO or any other monitoring software logs and saves the data, I will look it up and see whether there are indeed power spikes, which are a known thing for the reference 6900 XT. Thanks for the reply! Will post here one more time about this, as my vacation starts tomorrow and I will have more time to investigate.


----------



## GamerGuy (Feb 12, 2022)

Holy crap, a pair of  GB RX 6900 XT Xtreme WaterForce WB's!!! So jelly!!!


----------



## Fatal Fighter (Feb 12, 2022)

Looks good, but not that practical.


----------



## Felix123BU (Feb 12, 2022)

Fatal Fighter said:


> I am confused af. Is a 1500W Titanium PSU not enough for two OC'd 6900XTs?? OC settings are 2600/2700 with +15% power limit. No issues on stock. The PC shuts down on GT2 in Time Spy every time. I can only suspect some weird power spikes, maybe, but the integrated monitoring widget never shows power usage higher than 400W per card.


It should be, theoretically: that's 750W max per card, and even less once you subtract the rest of the system and account for how the power rails are split. Then again, the recommended minimum PSU for a single 6900 XT is 850W, so I'm not sure 1500W is enough for two water-cooled and OC'ed 6900 XTs.

I would try running Time Spy with only one GPU at the highest power possible, with the other one disabled during the test. GT2 can pull up to 380W sustained on my 6800 XT; for a 6900 XT that should be even more, before counting spikes. That way you make sure that (a) the PSU can at least handle one card, and (b) if one card runs without issues, you can check whether both GPUs run stable at the same settings.

Too-high frequency settings can also cause crashes in Time Spy, and generally any unstable OC crashes GT2 rather easily.

Not all 6900 XTs can run 2600-2700MHz stable.
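To illustrate the headroom math being discussed, here is a rough Python sketch. The 2.0x transient-spike multiplier and the ~250W figure for the rest of the system are my assumptions for illustration only; RDNA2 cards are known for brief spikes well above their average board power, but the exact factor varies by card and PSU:

```python
# Rough PSU headroom estimate for a multi-GPU rig.
# spike_factor is an ASSUMED worst-case transient multiplier, not a measurement.

def psu_headroom(psu_watts, gpu_avg_watts, n_gpus, rest_of_system_watts,
                 spike_factor=2.0):
    """Return remaining wattage if every GPU spikes simultaneously."""
    worst_case = n_gpus * gpu_avg_watts * spike_factor + rest_of_system_watts
    return psu_watts - worst_case

# Two 6900 XTs averaging ~400 W each (the figure the monitoring widget
# reported above) plus an assumed ~250 W for CPU, pumps, fans and drives:
print(psu_headroom(1500, 400, 2, 250))  # negative => spikes can trip OCP
```

A negative result means simultaneous spikes could momentarily exceed the PSU's rating and trip over-current protection, even though the *average* draw looks fine, which would be consistent with the shutdowns described above.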


----------



## Fatal Fighter (Feb 12, 2022)

Felix123BU said:


> It should be, theoretically: that's 750W max per card, and even less once you subtract the rest of the system and account for how the power rails are split. Then again, the recommended minimum PSU for a single 6900 XT is 850W, so I'm not sure 1500W is enough for two water-cooled and OC'ed 6900 XTs.
> 
> I would try running Time Spy with only one GPU at the highest power possible, with the other one disabled during the test. GT2 can pull up to 380W sustained on my 6800 XT; for a 6900 XT that should be even more, before counting spikes. That way you make sure that (a) the PSU can at least handle one card, and (b) if one card runs without issues, you can check whether both GPUs run stable at the same settings.
> 
> ...


The minimum requirement for this particular model is even a 900W PSU, but that's for the whole system, and that's why I think it kinda should be enough (for the 2700 frequency at least), but I'm not 100% sure. I think I need to see logs of the cards' power usage right before the shutdown, I guess.
Each card runs 2750MHz without issues, even though one card is way hotter than the other.
Actually, I have two PSUs at home; damn, I totally forgot that I hadn't sold my 1000W PSU yet. Tomorrow my vacation starts: disassembling my cards and testing with two PSUs as well.
Will post here later.


----------



## Fatal Fighter (Feb 14, 2022)

Looks like it is silicon lottery after all :|
I disassembled the card, re-applied the thermal paste and made sure the mounting pressure is good, but no joy.
One thing that also came to mind is that maybe it's a factory-defective base plate? Like it's not touching the die as it should. I took some photos; maybe someone will notice something here.

But 90% it is just bad luck with the silicon lottery.

As I mentioned in an earlier post, I have two of these cards: the Gigabyte 6900XT Xtreme Waterforce.


Thermals after running Time Spy GT2

"Bad card" 





"Good card"


----------



## chris189 (Feb 14, 2022)

Anyone know how to BIOS mod the 6600 XT?  I'm sure it's not possible yet.  Otherwise anyone know about a custom copper backplate for the Gigabyte Eagle AMD Radeon RX 6600 XT?


----------



## CMP01 (Feb 14, 2022)

After years of being shoehorned, one way or another, into Intel/Nvidia dominance (re: gaming laptops and higher-end desktops), I was amped by AMD's slow but steady comeback from the ruins.
And haven't they done well, too? First giving Intel a much-needed boot in the nethers, then, for the first time in a long time, trading blows at the very top with Nvidia this gen.
So here I am now with my first all-AMD system ever, my first AMD CPU or GPU since the noughties... and it rocks. Just my opinion (and one that's seen me come off worst in a few forums), but the RDNA2 lineup, especially the top half, is easily as worthy as the Nvidia one. Too many people took the overhype pill and run on the old "AMD are hot, hungry, and have poor drivers" myths, holding up RT and proprietary DLSS as proof of success by themselves, but I say screw 'em. A handful of supported games does not yet make a future standard (though it will be the future... in the future), never mind that only a year after launch, games are already making RT and DLSS an either/or choice on Ampere. Then there are the price problems stemming from all this: my 6800 XT was half the price of (and far more available, by two months or more than) a 3080 that only trades blows with it outside of RT, the one area where the difference might be noticeable. I had the choice of a system with the former plus other great parts, or a 3080 with... not much else. 'Nuff said.

Roll on RDNA3 and Zen 4.


----------



## turbogear (Feb 15, 2022)

chris189 said:


> Anyone know how to BIOS mod the 6600 XT?  I'm sure it's not possible yet.  Otherwise anyone know about a custom copper backplate for the Gigabyte Eagle AMD Radeon RX 6600 XT?


Modding the BIOS of RDNA2 cards is not possible.
The good community at Igor‘s Lab tried it, but I think they gave up as it does not work.


----------



## chris189 (Feb 15, 2022)

Here are my results from my failed attempt at thermal modding the Gigabyte Eagle AMD Radeon RX 6600 XT, followed by the results after doing it properly.

I got this card from a miner on eBay for $575, and I sometimes get continuous glitching over HDMI or display corruption.

The hotspot went from 100°C to 79°C after going from 1.150V to 1.025V using MorePowerTool (MPT).
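An undervolt that big also lines up with first-order power physics: dynamic power scales roughly with voltage squared times frequency. A toy Python estimate (my addition), under the simplifying assumption that clocks stay fixed, which real cards do not strictly honor:

```python
# First-order dynamic power scaling: P ~ C * V^2 * f.
# Assumes switched capacitance and clock stay fixed; real cards also shift
# clocks and leakage, so treat this as a rough estimate, not a measurement.

def relative_power(v_new, v_old):
    """Fraction of the original dynamic power after a voltage change."""
    return (v_new / v_old) ** 2

drop = 1 - relative_power(1.025, 1.150)
print(f"~{drop:.0%} less dynamic power")  # roughly a fifth less
```

Roughly a 20% cut in dynamic power from the voltage change alone would go a long way toward explaining a 21°C hotspot drop.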


----------



## turbogear (Feb 15, 2022)

chris189 said:


> Here are my results from my failed attempt at thermal modding the Gigabyte Eagle AMD Radeon RX 6600 XT, followed by the results after doing it properly.
> 
> I got this card from a miner on eBay for $575, and I sometimes get continuous glitching over HDMI or display corruption.
> 
> ...


To be honest, I am not too sure whether the glitches you see are due to a thermal issue or to something else wrong with the card.
Make sure the VRAM has proper contact with the heatsink.
VRAM data corruption can cause image issues like the ones you show in the photos.

I get something like that sometimes when I am too aggressive with OCing the VRAM.
I have a 6900XT OC Formula flashed with the BIOS from the AMD 6900XT reference LC card, which offers 18Gbps memory.
With that BIOS I unlock higher VRAM speeds.
That BIOS can‘t be flashed without a special programmer and is only compatible with 6900 XTXH cards.


----------



## chris189 (Feb 15, 2022)

@turbogear Thanks.  I wish the guy before me hadn't overclocked the VRAM so much while mining, then.  I should RMA the card, huh?


----------



## QuietBob (Feb 15, 2022)

chris189 said:


> Here's my results from my failed attempt at thermal modding the Gigabyte Eagle AMD Radeon RX 6600 XT.  Then after proper results.
> 
> This card I got from a miner on eBay for $575 & I get continuous glitching on the HDMI or display corruption sometimes.
> 
> ...


I'm really glad you sorted out the temps, but... 6500 rpm down to 4500, holy cannoli!  
I've never seen (or heard) such fan speeds on mine. But then I guess you don't mind the noise, what with those six case fans and a delta on the CPU?


----------



## turbogear (Feb 16, 2022)

chris189 said:


> @turbogear Thanks.  I wish the guy before me didn't overclock the VRAM so much when mining then.  I should RMA the card huh?


That‘s a good question: whether the previous owner was OCing the card while mining 24/7.
@Felix123BU can tell you more about mining and how cards are run there.

If you are randomly getting screen corruption, maybe you can try to RMA it, but do you still have warranty on it?
In Germany, many shops don‘t honor the warranty if you bought the card from a third party on eBay, unless you get a written declaration from the previous owner stating that they transferred ownership to you. We call this in Germany an *Abtretungserklärung*.


----------



## chris189 (Feb 16, 2022)

Yes, it still has warranty.  Thanks. I just wish the core was synced up with the hotspot, or actually the other way around: cool the hotspot down to as hot as the core runs.  I need to add a full-length copper backplate and replace all three 80mm fans on the Gigabyte Eagle with 3x 55 CFM 70mm Deltas (Delta AFC0712D) for 165 CFM.
GV-R66XTEAGLE-8GD
Warranty Coverage: 08/13/2021~09/07/2024


----------



## turbogear (Feb 16, 2022)

chris189 said:


> Yes still have warranty.  Thanks I just wish the Core was Synced up with the hotspot or actually the other way around.  Cool the Hotspot down to as hot as the Core.  I need to add a full length copper backplate & replace the fans on the Gigabyte Eagle, all 3 80mm fans with 3x 55cfm 70mm Deltas for 165 cfm.  Delta AFC0712D.
> GV-R66XTEAGLE-8GD
> Warranty Coverage: 08/13/2021~09/07/2024
> View attachment 236821


If thermals are your issue, maybe you can consider water cooling.
I am not sure whether a block or AIO is available for your model.
Bykski often offers a variety of blocks for different cards.
My 6900XT also has a block from them.

Here you will see my setup:  








(link: The RX 6000 series Owners' Club, www.techpowerup.com)


----------



## chris189 (Feb 16, 2022)

I replaced the stock core screws, which have thread stops on them, with regular screws, and dropped the core and hotspot by 3°C: from 60°C to 57°C on the core, and from 79°C to 76°C on the hotspot.


----------



## Fatal Fighter (Feb 16, 2022)

Fatal Fighter said:


> Looks like it is silicon lottery after all :|
> I disassembled the card, re-applied thermal paste and made sure mounting issue is good, but no joy.
> One thing that also came to my mind is maybe it is factory defected base plate? Like it is not touching die as it should. I made some photos, maybe someone will notice anything here.
> 
> ...


I thought I'd found what was wrong, but I was just wrong: even after cleaning these heavily clogged fins, the temps are pretty much the same. I have accepted now that this is just what the card's thermals look like. Still surprised that being clogged that much didn't affect thermals…


----------



## chris189 (Feb 16, 2022)

I tested max voltage on the Gigabyte Eagle AMD Radeon RX 6600 XT (stock voltage of 1.150V, max clock in the 2700-2800MHz range, 2200MHz memory), and the hotspot dropped from 100°C to 92°C, a solid 8°C improvement from the mounting pressure alone.




*The hotspot on these cards obviously gets hotter and hotter the more you overclock, but they really need to release a BIOS update limiting the hotspot to 84°C rather than 110°C... IDIOTIC AMD REASONING! Please, anyone who owns the GIGABYTE EAGLE card, or any GIGABYTE AMD card, get on eSupport & request a BIOS UPDATE. If enough of the community requests it, they will be forced to release it.*








*Temps overclocked: 2675-2775 core, 2200 memory tight timings, 14000mv memory*


----------



## Butanding1987 (Feb 16, 2022)

Fatal Fighter said:


> I thought I'd found what was wrong, but I was just wrong: even after cleaning these heavily clogged fins, the temps are pretty much the same. I have accepted now that this is just what the card's thermals look like. Still surprised that being clogged that much didn't affect thermals… View attachment 236847
> View attachment 236848


Is that the Xtreme Waterforce block? If I'm not mistaken, the regular Waterforce blocks are made of nickel-plated aluminum. I didn't know the Xtreme Waterforce suffered from the same problem?


----------



## turbogear (Feb 16, 2022)

Fatal Fighter said:


> I thought I'd found what was wrong, but I was just wrong: even after cleaning these heavily clogged fins, the temps are pretty much the same. I have accepted now that this is just what the card's thermals look like. Still surprised that being clogged that much didn't affect thermals… View attachment 236847
> View attachment 236848


Which cooling liquid are you using?
I once had a clogging issue with EK's light blue liquid. It clogged all the fins in the CPU and GPU blocks.
That dark color on the block looks like corrosion.
It looks like the nickel plating is somehow reacting with something in the cooling fluid.
I recommend that you use a different cooling fluid.

Nowadays I use Aqua Computer Double Protect Ultra.


			https://www.caseking.de/aqua-computer-double-protect-ultra-5l-kanister-blau-wazu-447.html?sPartner=999&gclid=Cj0KCQiAu62QBhC7ARIsALXijXR_IsBqLpYjQFV2BA-0UX8sU37_Sl6AsoRtd0VcvLmr9v-b1A7OkRYaAuylEALw_wcB
		


Regarding what you mentioned in a previous post about the temperature differences between the two cards, that is the kind of problem multiple people have faced with various factory water-block-fitted cards.
Liquid metal usually helps; reapplying the paste did not help other people either.
In any case, all serious overclockers of the 6900XT, especially of XTXH cards, use LM.

You need to cool these babies really well to get a good benchmark run.
If temperatures go above 90°C on these cards, you can pretty much forget very high scores.
Performance starts dropping as the hotspot temperature climbs even past 70°C.

I have an external Mo-Ra radiator attached to my system, and to get the results I show in the post below, I needed to get the hotspot temperature into the range of 15°C before starting the benchmark, by putting the Mo-Ra outside on the balcony where it was -2°C.








(link: The RX 6000 series Owners' Club, www.techpowerup.com)
				




Regarding the PSU: if you want to run a dual-GPU setup, then even 1500W can get into trouble because of the higher power spikes such a setup causes.
There are some experts on the Hardwareluxx forum who run multiple 6900XTs, but they combine more than one PSU to keep the system from shutting down when benching.


----------



## chris189 (Feb 16, 2022)

Is my score decent for a Gigabyte Eagle AMD Radeon RX 6600 XT?


----------



## turbogear (Feb 16, 2022)

chris189 said:


> Is my score decent for a GIgabyte Eagle AMD Radeon RX 6600 XT?
> View attachment 236857


I am not too sure what a good score is for 6600XT cards, but if you want, you can press the Compare Results Online button and see how yours stacks up against the other 6600XTs that people have benched.

My 6900XT is still the second-best 6900XT worldwide in Time Spy.
A friend, Devcom at Hardwareluxx, was able to push about 120 points more out of his card and holds first place now.
I did not get time to push mine further in the last two weeks, as I was sick with COVID.
Now the temperatures outside are no longer below zero, so I can't push it higher.
I need at least -2°C on the balcony to try to push it further by putting my Mo-Ra outside.


----------



## chris189 (Feb 16, 2022)

@turbogear Why doesn't my result show up on the results page when I'm looking at all the other 3700X / 6600 XT scores?


----------



## turbogear (Feb 16, 2022)

chris189 said:


> @turbogear Why doesn't my result show up in the results page when I'm looking at all other 3700x 6600 xt scores?


Maybe it is too low to appear on the first page.
If you are logged in to your 3DMark account, there is usually a floating window that indicates where your score sits compared to the others.


----------



## Butanding1987 (Feb 16, 2022)

turbogear said:


> I am not too sure what a good score is for 6600XT cards, but if you want, you can press the Compare Results Online button and see how yours stacks up against the other 6600XTs that people have benched.
> 
> My 6900XT is still the second-best 6900XT worldwide in Time Spy.
> A friend, Devcom at Hardwareluxx, was able to push about 120 points more out of his card and holds first place now.
> ...


I think biso biso has dethroned devcom, so you're No. 3 now.


----------



## turbogear (Feb 16, 2022)

Butanding1987 said:


> I think biso biso has dethroned devcom, so you're No. 3 now.


Seems like I missed that one, as I was out of action for the last two weeks due to a COVID infection.

But I am still in second place with the 5950X and 6900XT combination.
Maybe Devcom and I need to move to the Intel Core i9-12900K.


----------



## Butanding1987 (Feb 16, 2022)

turbogear said:


> Seems like I missed that one, as I was out of action for the last two weeks due to a COVID infection.
> 
> But I am still in second place with the 5950X and 6900XT combination.
> Maybe Devcom and I need to move to the Intel Core i9-12900K.


Go, go, go. I'm rooting for you.


----------



## Fatal Fighter (Feb 16, 2022)

Butanding1987 said:


> Is that the Extreme Waterforce block? If I'm not mistaken, the regular Waterforce blocks are made of nickel-plated aluminum. I didn't know the Extreme Waterforce also suffers from the same problem?


It says copper base plate on the website; it's heavy af, and I can't really tell whether it's copper or aluminium other than by the weight, really. But I have seen posts on Reddit saying the 3080's Waterforce block is made of aluminum…


turbogear said:


> Which cooling liquid are you using?
> I once had a clogging issue with EK's light blue liquid. It clogged all the fins in the CPU and GPU blocks.
> That dark color on the block looks like corrosion.
> It looks like the nickel plating is somehow reacting with something in the cooling fluid.
> ...


I was indeed using EK coolant, some white one which I accidentally purchased when I thought I had chosen the transparent one, but for only about 4 months or less. I was shocked to find the water blocks this heavily clogged after such a short period, both GPU and CPU; it took me 6-7 hours yesterday to do the whole disassembly and cleaning.
No corrosion though, yay.
Funny thing is that the other card I purchased recently is doing great temperature-wise; that's the fact that made me reconsider the thermals of the first one.
And regarding the PSU, I think that's totally true indeed.
Thanks for the reply! I guess I will try to do some benchmarking today with two PSUs.
My girlfriend needs a GPU upgrade; I guess she is getting this one.


----------



## chris189 (Feb 16, 2022)

@turbogear I looked it up: for the 3700X and 6600 XT I'm #2 now, but I don't show up in any list for any of my systems.  Is it because I'm using the free demo?


----------



## Felix123BU (Feb 16, 2022)

turbogear said:


> That‘s a good question: whether the previous owner was OCing the card while mining 24/7.
> @Felix123BU can tell you more about mining and how cards are run there.
> 
> If you are randomly getting screen corruption, maybe you can try to RMA it, but do you still have warranty on it?
> In Germany, many shops don‘t honor the warranty if you bought the card from a third party on eBay, unless you get a written declaration from the previous owner stating that they transferred ownership to you. We call this in Germany an *Abtretungserklärung*.


Mining should normally not be able to damage a card if the person running it had any idea what they were doing. VRAM can be pushed up to a point; push it too far and it becomes unstable, but with these cards the limits used for mining are not something that would normally damage a card. Anyway, running a card at full stock settings should answer the stability question: if it's stable, it's fine. Any OC is a gamble anyway, mining card or not.


----------



## Fatal Fighter (Feb 16, 2022)

What 2500W benchmarking looks like


----------



## turbogear (Feb 16, 2022)

Fatal Fighter said:


> What 2500W benchmarking looks like
> 
> View attachment 236889


How many rads do you have to cool that monster?

I can see it begging you to add a Mo-Ra to the setup.






(link: MO-RA3 Serie, shop.watercool.de)


----------



## Fatal Fighter (Feb 16, 2022)

Oh, I was thinking about that a few months ago; I think I heard about the Mo-Ra for the first time from you here.
Right now I'm using 360mm/54mm and 420mm/80mm rads.


----------



## turbogear (Feb 16, 2022)

chris189 said:


> @turbogear I looked it up, for the 3700x & 6600 xt I'm #2 now but I don't show up in any list for any of my systems.  Is it because I'm using the Free Demo?


That could be the case. I am not too sure about it. 
I have a standalone license version of 3DMark. I bought it when they had it on offer for just a few euros.



Fatal Fighter said:


> Oh I was thinking about that few months ago, I think I heard about Mo-Ra for the first time from you here
> Right now using 360mm/54mm and 420/80mm rads


I was also not into the MO-RA before, but I joined that club a few months ago, encouraged by the OC community at Hardwareluxx.








[Sammelthread] - Offizieller AMD RX 6700 / 6700 XT / 6750 XT / 6800 / 6800 XT / 6900 XT / 6950 XT Overclocking und Modding Thread (Wakü - Lukü - LN2): www.hardwareluxx.de
				




There is also a discussion thread just about MO-RA:








[Sammelthread] - Der offizielle MO-RA3 Sammelthread: www.hardwareluxx.de


----------



## Fatal Fighter (Feb 16, 2022)

I'm still into it even now, but I can't really find a place for it. I don't have a balcony off my room; otherwise I would already be a proud owner of one.


----------



## turbogear (Feb 16, 2022)

By the way, how is the flow rate of your pump with multiple blocks?
The more blocks you have, the more restriction, and a D5 pump has lower head pressure than a DDC.
I have two DDC pumps in series; one mounted inside the case and the other on the MO-RA. 
My flow rate is 250 l/h with the pumps running at 85%.   
If the flow rate is low, I think below 120 l/h, then the temperatures get worse.

You said one card is hotter than the other. 
Did you check whether it makes a difference if you swap the positions of the cards?
I mean, with your setup the water enters and leaves the two cards in opposite directions: 
on the lower card the input port is towards the front of the case, and on the upper card it is the output port that is towards the front.
I have never had multiple GPUs in a loop, so I am not sure what the optimum way to connect them is.
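For anyone wondering how much flow actually matters for loop temperatures: the steady-state water temperature rise across the loop is roughly ΔT = P / (ṁ · c_p). A quick back-of-the-envelope sketch (the 600 W heat load is just an example figure, not a measurement from my loop, and block convection adds more on top of this):

```python
# Rough sketch: water temperature rise across a loop at a given flow rate.
# Assumes all heat ends up in the water; wattage/flow values are examples only.

WATER_CP = 4186.0   # J/(kg*K), specific heat of water
WATER_RHO = 1.0     # kg/l (close enough for typical coolant)

def loop_delta_t(watts, flow_l_per_h):
    """Steady-state water temperature rise (K) between loop inlet and outlet."""
    mass_flow = flow_l_per_h * WATER_RHO / 3600.0   # convert l/h to kg/s
    return watts / (mass_flow * WATER_CP)

# ~600 W of heat at a healthy 250 l/h vs a sluggish 100 l/h:
print(round(loop_delta_t(600, 250), 1))  # ~2.1 K
print(round(loop_delta_t(600, 100), 1))  # ~5.2 K
```

The water-side delta alone stays small, which is why the bigger effect of low flow is reduced convection inside the fine block fins rather than bulk water temperature.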


----------



## Fatal Fighter (Feb 16, 2022)

turbogear said:


> By the way, how is the flow rate of your pump with multiple blocks?
> The more blocks you have, the more restriction, and a D5 pump has lower head pressure than a DDC.
> I have two DDC pumps in series; one mounted inside the case and the other on the MO-RA.
> My flow rate is 250 l/h with the pumps running at 85%.
> ...


I can't check water flow, but it is somewhat decent, as changing the pump speed has a good effect on temperatures overall; water stays below 37-38°C.
I actually did swap the cards yesterday; it made a small difference indeed, and the rest is just worse silicon IMHO.


----------



## turbogear (Feb 16, 2022)

I have the MO-RA connected over quick-release connectors so that I can also run the computer without it, on the two internal rads.
One time I made a mistake and coupled the connectors in the wrong direction, so the two pumps were pushing against each other; within minutes of starting Time Spy the GPU temperature jumped to 110°C and the computer shut down. 
I noticed the mistake right away and recoupled it correctly. 



Fatal Fighter said:


> I can't check water flow, but it is somewhat decent, as changing the pump speed has a good effect on temperatures overall; water stays below 37-38°C.
> I actually did swap the cards yesterday; it made a small difference indeed, and the rest is just worse silicon IMHO.


37°C is on the high side, actually, but you do have two power-hungry GPUs in the loop.   
As far as I remember, even with the Mo-Ra in the room my loop temperatures are in the range of 31°C under stress tests.


----------



## skizzo (Feb 16, 2022)

Fatal Fighter said:


> It says copper base plate on the website; it is heavy af, and I can't really say how to tell copper from aluminium other than by the weight, really. But I have seen posts on Reddit that the 3080's water block is made of aluminum…
> 
> I was indeed using EK coolant, some white one which I accidentally purchased thinking I had chosen a transparent one, but only for about 4 months or less, really. I was shocked to find the water blocks this heavily clogged after such a short period, both GPU and CPU; it took me 6-7 hours yesterday for the whole disassembly and cleaning process.
> No corrosion though, yay
> ...



damn, that is one of the grossest looking water blocks I've ever seen lol

So I hope you can clarify: you were using one of EK's "solid" aka opaque coolants? Specifically the white one, which is called Cloud White if I'm not mistaken?

I have a similar negative story with that coolant, and it is literally the only store-bought coolant I've ever used; otherwise I use plain distilled water with biocide added by myself, and that has always done well for me. That EK Cloud White stuff gunked up in a particular corner of my GPU block and left a thin residue on just about every o-ring. The fins were not clogged and temps were not impacted, though; I only noticed it because it looked a bit like tartar build-up on teeth, and I could see it since both block tops are plexi. It wasn't like cement or hard to remove: I just poked it with a screwdriver, it broke right down, and the blocks, fittings, and whole loop cleaned up fine. The build-up was no bigger than someone flicking a booger into the inside of my block, and that was after about a year of use, with the fluid changed partway through. I don't think there is anything "wrong" with their coolant specifically; this seems to be a general challenge with any opaque coolant, regardless of brand or product line, judging from similar stories I've heard from others. I wouldn't use the colored transparent ones either; I'm under the impression the coloring, and especially the stuff that makes them solid/opaque, is what causes these gunk issues.

So you have a pretty extreme situation going on there. Not only is that block disgustingly gunked up, it got that way in record time. I could believe that after 2 years of never changing the fluid, but in around 4 months? No way! And it isn't the right color either; my "gunk" was literally white, what a surprise, right lol. Seeing that blue/green tinge makes me think you could have some other reaction going on in your loop. The white discoloration and/or film on the block and inner plexi is what I noticed on my blocks, and that cleaned up OK.

So I see you got it cleaned up to your liking, but this is my long-winded way of saying I think you may have a different issue going on, because that is a lot of build-up of a suspicious color in a short time to me.


----------



## chris189 (Feb 16, 2022)

How are others achieving a 2900MHz core clock & 2300MHz memory?  I know 2900MHz core needs more than the stock 1.15V for the core, & idk about memory; stock memory voltage is 1.35V, so idk which voltage is ideal to achieve 2300MHz memory.  The highest I bench at is in the range of 2688-2788MHz core & 2222MHz memory on fast timings.

Thanks


----------



## Fatal Fighter (Feb 16, 2022)

skizzo said:


> So I hope you can clarify: you were using one of EK's "solid" aka opaque coolants? Specifically the white one, which is called Cloud White if I'm not mistaken?


YES that one! And the worst part is that was an accidental purchase 


skizzo said:


> I have a similar negative story with that coolant, and it is literally the only store-bought coolant I've ever used. ...
> So I see you got it cleaned up to your liking, but this is my long-winded way of saying I think you may have a different issue going on, because that is a lot of build-up of a suspicious color in a short time to me.


Well, believe me, I was shocked to discover that, and I think the greenish tint comes from the coolant itself, because I remember it having some kind of green thingies inside, like really, really small ones, and I thought I just needed to shake it, like that's how it works lol
I was never going for an opaque liquid; it was just a stupid mistake. 
I have seen tons of posts with such issues, but that was happening after a year or two, as you said lol
Maybe my particular bottle was factory-defective or something, idk.

Using Corsair's XL8 now, hopefully no issues with it


----------



## turbogear (Feb 16, 2022)

chris189 said:


> How are others achieving a 2900MHz core clock & 2300MHz memory?  I know 2900MHz core needs more than the stock 1.15V for the core, & idk about memory; stock memory voltage is 1.35V, so idk which voltage is ideal to achieve 2300MHz memory.  The highest I bench at is in the range of 2688-2788MHz core & 2222MHz memory on fast timings.
> 
> Thanks


For the 6900 XT to achieve these GPU and memory clocks, two things are needed. 
For the memory to scale that high, I used an external programmer to flash a BIOS from the 6900 XT LC AMD reference card.
This BIOS is only compatible with 6900 XT XTXH cards and comes from the 18Gbps memory cards.

For achieving GPU clocks in the range of 2900MHz, there is a feature in MPT that applies higher voltage to the core depending on the hotspot temperature.  
That's why I cool the hotspot down to around 15°C by putting the Mo-Ra on the balcony, where this winter I had temperatures down to -2°C.
This year it has not been that cold so far; usually we get down to -10°C. 



Fatal Fighter said:


> YES that one! And the worst part is that it was an accidental purchase
> 
> Well, believe me, I was shocked to discover that, and I think the greenish tint comes from the coolant itself, because I remember it having some kind of green thingies inside, like really, really small ones, and I thought I just needed to shake it, like that's how it works lol
> I was never going for an opaque liquid; it was just a stupid mistake.
> ...


I also tried that EK opaque light blue once.
It is the worst coolant; I had the same issues with sediment inside the fine fins of the GPU and CPU blocks. 

For a long time now I have used XSPC EC6, which is really good, and I had no issues with it.
In recent months I could not buy it anymore, as it was often sold out everywhere in Germany.
So this is the first time I am trying Aqua Computer Double Protect Ultra; I hope it is as good as the XSPC I used for years.
At least I asked some Hardwareluxx forum members who have been using it for a long time, and they had no complaints about it.


----------



## Felix123BU (Feb 16, 2022)

turbogear said:


> For the 6900 XT to achieve these GPU and memory clocks, two things are needed.
> For the memory to scale that high, I used an external programmer to flash a BIOS from the 6900 XT LC AMD reference card.
> This BIOS is only compatible with 6900 XT XTXH cards and comes from the 18Gbps memory cards.
> 
> ...


"For achieving GPU clock in range of 2900MHz, there is a feature in the MPT that applies higher voltage to the core depending on the Hotspot temperature" - where and what feature is that?   
I confess not touching MPT for some time, but this feature sound really sweet!


----------



## turbogear (Feb 16, 2022)

Felix123BU said:


> "For achieving GPU clock in range of 2900MHz, there is a feature in the MPT that applies higher voltage to the core depending on the Hotspot temperature" - where and what feature is that?
> I confess not touching MPT for some time, but this feature sound really sweet!


You have to enable this feature under the Features menu.

Then you can change the voltage in the power section for GFX and SoC. 
Note this is MPT v1.3.8b1; if you have a much older version, the feature is not there. 
It was introduced at some point by hellm from Igor'sLAB, and the good members at Hardwareluxx discovered what it does. 

You need to cool your GPU really well; otherwise it heats up very quickly when you enable this and your score is gone.


----------



## chris189 (Feb 16, 2022)

@Felix123BU I couldn't find anything like that in MPT.  Advice?

After changing what you said, I still see a max of 1150mV core voltage in Wattman.  I need maybe 1175mV & I'll be happy.  I set vMin to 1200 on GFX min/max.

Thanks

*Update*
Never mind, I set vMin min/max to 1200 but set the new voltage from 1150mV to 1175mV & it worked!  Thanks

*Update*
It gimped it though; it's not going past 500MHz under load.  I've seen this before: if I try to increase the voltage, it does this.


----------



## chris189 (Feb 17, 2022)

Where is the temperature limit in Wattman / AMD Radeon Software?  I'd like to limit the core & hotspot temperature to 84°C.


----------



## skizzo (Feb 17, 2022)

chris189 said:


> Where is the temperature limit in Wattman AMD Radeon Software?  I'd like to limit the temperature of the Core & Hotspot to 84°C.


never heard of or seen such a thing before; it doesn't exist, at least not in their official software. maybe there are user-made/third-party tools or mods that allow stuff like that to be configured, but I don't know.

with their official software you can limit CLOCKS, POWER, and/or VOLTAGE, which impacts temps. but they don't offer users the ability to say "GPU never goes above X°C"


----------



## Felix123BU (Feb 17, 2022)

skizzo said:


> never heard or seen such a thing before, it doesn't exist. at least not in their official software. maybe there are user made/third party tools/mods that allow stuff like that to be configured, but I don't know.
> 
> with their official software you can limit, CLOCKS, POWER, and/or VOLTAGE, which impacts temps. but it's not like they offer users the ability to say "GPU never goes above X*C"


Yes, there is no such limit to be configured in Wattman, and as far as I know not in MPT either. There is an upper max limit where the GPU will start to throttle, but I don't know it to be configurable, and it's only a protection limit; it might have weird effects if set a lot lower. 

But I can also deduce that if such a value exists in the GPU BIOS, it's only a matter of time until the MPT guys, who are doing a fantastic job, figure it out and expose it to the user, if it's not there already in some shape.

You can sort of achieve something similar by setting a custom fan profile so that the fans effectively keep the card close to a desired temperature.
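To illustrate that workaround: a custom fan profile is effectively just a temperature-to-duty curve that ramps toward 100% near the temperature you want to hold. A rough sketch of the idea (the curve points below are made-up example values, not anything Wattman actually exposes):

```python
# Sketch of a piecewise-linear fan curve, the "soft temperature limit" idea:
# duty ramps to 100% as the card approaches the target, so the fans (rather
# than a hard limit) hold the card near that temperature.

def fan_percent(temp_c, curve):
    """Interpolate fan duty (%) from a sorted list of (temp_C, duty_%) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (temp_c - t0) * (d1 - d0) / (t1 - t0)

# Example curve targeting ~84 C: silent below 40 C, full blast at 84 C.
curve = [(40, 0), (60, 40), (84, 100)]
print(fan_percent(72, curve))  # 70.0
```

Whether the card actually settles near the target then depends on whether 100% fan duty can remove the heat at all, which a hard limit would not need to assume.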


----------



## chris189 (Feb 17, 2022)

If you watch my videos: the 1st one is undervolted down to 1000mV, where *the core hits about 57°C & the hotspot 72°C*.  The 2nd video shows it overclocked at the stock voltage of 1150mV: *core @ 72°C & hotspot @ 102°C*.

You can't set the Target Temperature lower than 60°C or it messes up Wattman's fan profile.  So I just set the Zero RPM fan stop to 40°C & start to 45°C, so as soon as it hits 45°C the fan goes full speed.


----------



## nolive721 (Feb 20, 2022)

chris189 said:


> If you watch my videos: the 1st one is undervolted down to 1000mV, where *the core hits about 57°C & the hotspot 72°C*.  The 2nd video shows it overclocked at the stock voltage of 1150mV: *core @ 72°C & hotspot @ 102°C*.
> 
> You can't set the Target Temperature lower than 60°C or it messes up Wattman's fan profile.  So I just set the Zero RPM fan stop to 40°C & start to 45°C, so as soon as it hits 45°C the fan goes full speed.
> 
> View attachment 237075


@chris189

I am very interested in your testing, both the UV and pushing the envelope of temperature control.

My best effort for my 6600 XT is a UV at 2200MHz core at 1.062V, and the card runs in the low 50s core and high 50s junction temps.

At default it's 10°C up on both, and OC'd at 2800MHz core / 2180MHz memory fast timings with the voltage at 1.15V I see temps around 65°C core and 80°C junction. 

My card seems not to be picking up MPT changes, as I can't push the core frequency higher even when increasing TDP and voltage, which is strange.

Maybe it's because I had very good experiences with my previous Polaris and Vega GPUs and I am hoping for too much, I don't know.

In any case, I love how efficient RDNA2 is with power, temps and noise.


----------



## chris189 (Feb 20, 2022)

@nolive721 Check out my 1000mV undervolt video, 2600MHz core / 2200MHz memory fast timings.  Check out the stats.  lmk

PS - 1000mV is unstable at 2600MHz, so 2500-2550MHz is optimal @ 1000mV.


----------



## nolive721 (Feb 20, 2022)

chris189 said:


> @nolive721 Check out my 1000mV undervolt video, 2600MHz core / 2200MHz memory fast timings.  Check out the stats.  lmk
> 
> PS - 1000mV is unstable at 2600MHz, so 2500-2550MHz is optimal @ 1000mV.


thank you for sharing

There is no way in the world my Nitro+ can boost at its rated game frequency with 1V. If yours can achieve that, you have an uber gold chip indeed, or maybe there is some MPT tuning you are applying that I am completely missing here?
Even so, I have not seen a 6600 XT that could run those frequencies at that low a voltage.

@chris189

Please ignore my previous posts in terms of UV and core frequency; I was referring to the input voltage when I said my card does 1.062V stable.

With that 1.062V input, what I can achieve stable with 1V used by the card is 2485MHz core / 2175MHz memory fast timings, so better of course, but really nowhere near yours.


----------



## chris189 (Feb 20, 2022)

@nolive721 I think the hotspot sensor is on the core die itself on the Navi cards, whether RDNA 1 or RDNA 2.  At 1000mV core voltage, which yields 0.968V max, I get 60°C core & 82°C hotspot.
I'll be replacing my fans soon, the 3x 80mm fans with 4x 70mm Deltas for 250 CFM total, which I hope will reduce the hotspot temperature.
This is at 2575MHz core clock & 2200MHz memory clock @ fast timings.

_Update_: The thermal glue is making it run hot as heck!
Horrible-performing thermal glue.

*UPDATE*
See the little blue squares?  Those are Laird T-Flex 6100 Thermagon thermal pads that transfer heat from the video card PCB (printed circuit board) to the heatsink, which is cooled by the fans.  It's amazing how cool it runs now.

Rise Of The Tomb Raider 4K load Hwinfo temperatures Gigabyte Eagle AMD Radeon RX 6600 XT





















Here are my stock results after adding a shim to the core, but man, that memory is too hot with the current thermal pads installed & at 1.4V memory voltage.














Here are my results from a 4K Forza Motorsport 7 session: 3700X @ 4.7GHz & Gigabyte Eagle AMD Radeon RX 6600 XT @ 2575MHz core (0.988V) & 2200MHz memory fast timings with 1.25V memory voltage.

56°C CORE @ 0.988V
68°C HOT SPOT
71.9°C MEMORY, undervolted to 1.25V from 1.35V

I thermal-adhered a copper shim to the heatsink to increase core pressure, using regular M2 x 7mm screws for now (wafer-head M2 x 7mm screws are the plan), with Arctic Silver Ceramique 2 on the core itself.  This is the Gigabyte Eagle AMD Radeon RX 6600 XT.


----------



## chris189 (Feb 25, 2022)

I ordered the 4x 70mm Deltas (250 CFM total airflow) & I need to order the VGA 4-pin to PWM adapter & PWM splitters.  I'm sure it will help bring the hotspot down.

Those blue squares are Laird Thermagon T-Flex 6100 pads that cool exceptionally well.


----------



## nolive721 (Feb 25, 2022)

That's what I call an enthusiast! 

Nice to see what you are doing here!


----------



## chris189 (Feb 25, 2022)

@nolive721 Thank you so much, yes I love this card a lot; too bad it's a little damaged from the previous owner, who mined on it.  The screen flashes randomly here & there on a continuous basis.  I think I need to bake it, 350°F for 15 min; reflowing the core & memory might fix the issue.  Idk, it's a little risky, but I think it would work.  I can't wait to get all my parts in & post the results!


----------



## Felix123BU (Feb 25, 2022)

chris189 said:


> @nolive721 Thank you so much, yes I love this card a lot; too bad it's a little damaged from the previous owner, who mined on it.  The screen flashes randomly here & there on a continuous basis.  I think I need to bake it, 350°F for 15 min; reflowing the core & memory might fix the issue.  Idk, it's a little risky, but I think it would work.  I can't wait to get all my parts in & post the results!


I would recommend caution with that; I did it once with an R9 290 with similar symptoms, and it got even worse.


----------



## Butanding1987 (Feb 26, 2022)

chris189 said:


> @nolive721 I think I need to bake it, 350°F for 15 min; reflowing the core & memory might fix the issue.  Idk, it's a little risky, but I think it would work.


DON'T DO IT.


----------



## chris189 (Feb 26, 2022)

By the way, do you guys think the card can handle the 12V current of 4x 70mm Delta AUB0712VH fans at 0.56A each (0.60A max, i.e. 7.2W max per fan), for 2.4A & 28.8W of fan power in total?

They provide 62.5 CFM nominal each, for 250 CFM total airflow with all 4 at full tilt.


----------



## chris189 (Mar 1, 2022)

I got the Deltas in & could only hook up 2 of the fans, but the hotspot only hit 79°C with the clock at 2750MHz; not bad.


----------



## chris189 (Mar 1, 2022)

Check out the HWiNFO stats with just 1 of the 70mm Deltas running, 975mV core & 1250mV memory, on the Gigabyte Eagle AMD Radeon RX 6600 XT.

These are the new pads I'm using: Thermagon Laird A10095-05 silicone thermal gap pads.


----------



## pavle (Mar 1, 2022)

chris189 said:


> By the way, do you guys think the card can handle the 12V current of 4x 70mm Delta AUB0712VH fans at 0.56A each (0.60A max, i.e. 7.2W max per fan), for 2.4A & 28.8W of fan power in total?...


I don't think the card's fan circuit is good enough for 4 Deltas; I would have to check the specifics of the fan controller circuitry... Perhaps connect them unregulated to a Molex connector, either at 5V (red +, black -) or at 7V (yellow +, red -), to avoid noisiness.


----------



## chris189 (Mar 1, 2022)

@pavle Surprisingly enough, they aren't that noisy.  I wish I had the rest of my cables to hook up the remaining fans.

Stock fans: 12V x 0.35A = 4.2W each; 3 in total at 12.6W / 1.05A.

Delta fans: 12V x 0.56A = 6.72W each; 4 in total at 26.88W / 2.24A.
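Sanity-checking those nominal figures (these are ratings from the fan labels, not measurements, and the card's actual fan-header current limit is unknown):

```python
# Quick check of the fan-power arithmetic above: nominal current per fan
# times fan count, on a 12 V rail.

def fan_load(volts, amps_each, count):
    """Return (total_amps, total_watts) for identical fans on one rail."""
    return count * amps_each, count * amps_each * volts

stock_a, stock_w = fan_load(12, 0.35, 3)   # 3x stock fans
delta_a, delta_w = fan_load(12, 0.56, 4)   # 4x Delta AUB0712VH, nominal rating

print(f"stock: {stock_a:.2f} A / {stock_w:.2f} W")   # stock: 1.05 A / 12.60 W
print(f"delta: {delta_a:.2f} A / {delta_w:.2f} W")   # delta: 2.24 A / 26.88 W
print(f"ratio: {delta_w / stock_w:.2f}x")            # ratio: 2.13x
```

So the Deltas draw a bit over twice what the header was designed to feed, which is exactly the reserve-margin question raised below.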


----------



## pavle (Mar 2, 2022)

I see; well, that's still more than twice the original amperage, and I'm not sure AMD or Gigabyte left that much reserve in the fan controller circuitry. One Molex connector will suffice for those 4 Deltas, but you need to run connections from it to + and GND on all 4 fans. You'll need some wire, old clipped leads from resistors and/or other components that are just the right size to fit into the fan connectors, and a soldering iron.


----------



## chris189 (Mar 2, 2022)

@pavle If only I were able to solder where I'm staying now.  I can, however, buy a 4-way PWM-to-Molex splitter for all 4 fans.

So you're saying I shouldn't even try it, because it could likely burn out the fan controller?  That sure would suck... lol


----------



## pavle (Mar 2, 2022)

I tried searching for the fan controller but found nothing; I haven't gone in-depth, though.  I'd say the Molex is the better idea.


----------



## chris189 (Mar 2, 2022)

Thanks.  Here I found this...


----------



## pavle (Mar 2, 2022)

This one probably runs at 12V. Depending on the noise level, you could rearrange those two connections to a lower voltage for less noise; it should still keep the card cool.


----------



## chris189 (Mar 2, 2022)

@pavle The card needs 12V continuous under load to keep the hotspot under control.

Can you see the fan controller chip in this picture?


----------



## pavle (Mar 2, 2022)

Looks like the fan controller is the little 4-pin chip to the right of the fan connector, with 2 transistors below it (the black squares with 3 pins soldered to the board), but I could be wrong.
Anyway, if those transistors are for the fans, I don't see them being capable of passing much current...


----------



## chris189 (Mar 2, 2022)

Well, I hooked it all up anyway because I couldn't bear to wait.  It's working fine, but they don't quite reach maximum RPM: about 6,000 RPM max vs 6,150 RPM with a single fan hooked up.  But it's all good.

It dropped my hotspot in 3DMark Time Spy by 21°C, from 106°C to 85°C, which is amazing.  It's also amazing how hot the GDDR6 RAM gets.











AMD Radeon RX 6600 XT video card benchmark result - AMD Ryzen 7 3700X,ASUSTeK COMPUTER INC. ROG STRIX X570-I GAMING (3dmark.com)




Modded Gigabyte Eagle AMD Radeon RX 6600 XT 4.5Ghz 3700X 4K ULTRA Rise Of The Tomb Raider - YouTube


----------



## pavle (Mar 3, 2022)

Sweet! Looking good and cooling fiercely! 
But wow, 6000RPM, mighty fine if the noise isn't excessive.


----------



## fullinfusion (Mar 3, 2022)

I still want a reference card but even AMD(dot)com sells the 6700XT for over a grand.


----------



## Dropler (Mar 6, 2022)

Hello everybody. I was wondering if anybody could help me with some information about thermal pad thickness on the Red Devil 6900 XT. I'm going back to air-cooled, and the old thermal pads have dried out completely, so I have to apply new ones. I purchased some Thermal Grizzly Minus Pad 8 1.5mm pads, but I discovered there are a couple of chips that need a thicker pad. Does anybody know which thickness is required for the chips circled in red? The air cooler is not the same height there as at the other spots.


----------



## chris189 (Mar 7, 2022)

@Dropler I would measure the heatsink: the height difference between the core contact area and the spots you circled, with a simple ruler, in millimeters.

As for me, right now I used something like 0.5mm pads all around & my memory is running a bit warm, hitting 80°C overclocked & overvolted, when before I think it hit 66°C on the thicker pads, *so if you encourage me I'll take the card apart again today & fix the memory pads.*

Thanks


----------



## pavle (Mar 8, 2022)

Chris, please post the results with the thicker pads: are they helping, or is the extra material just acting as an insulator with temps staying the same?


----------



## Taraquin (Mar 8, 2022)

I'm considering buying an RX 6600/XT for a budget build, but I prefer Afterburner for undervolting over Adrenalin (stable profiles on my RX 480, 580 and 5700 XT kept resetting). Is it possible to use Afterburner for curve editing?


----------



## chris189 (Mar 8, 2022)

I'm having major thermal issues now all of a sudden.  I thought my vRAM was running a little hot, by 10°C or so, so I decided to take the card apart to check it out, & ever since then it's been running too hot.

I undervolted the vRAM from 1350mV down to 1100mV & it's stable at stock memory clocks / fast timings, with much cooler modules.

Why is it so hot?  How do I fix it?





This screenshot is from before, with the thicker pads, but now I prefer the appearance of the thinner pads, so I'm using those on the vRAM.





So I found out the PCB got stretched by the shim, which now causes the core not to make contact at all without a shim installed.  So I decided to try my AMD Radeon RX 480 X-clamp on the Gigabyte Eagle RX 6600 XT; it fits & dropped the hotspot temps by 23°C, though the hotspot is still hotter than before.  I think it will take some time for the X-clamp to flex the PCB back into shape.  What are your thoughts? 

PS - Where can I buy an RX 480 X-clamp?  I need to buy a couple.

Thanks











Here are my POST-INITIAL RESULTS, about 5-15 min after the initial ones.  The PCB is already conforming, bros!





So it was working well until I restarted, & now it won't POST or get detected, either on a Gigabyte GA-A55M-S2V with an AMD A6-3670K or on my AMD Ryzen 7 3700X rig.  I pulled it apart yet again & booted it up without the heatsink to see if the core got hot, & it did, like it was working, so idk guys.  Why isn't it getting detected?  It's not shorting out; or maybe the card is just dead now, idk.


----------



## pavle (Mar 9, 2022)

Well, that's sad. With long and heavy cards I always "hang" the right end from the case with some zip-ties (e.g. my GTX 980 in the pic) to bend the PCB as little as possible.
Regarding pads, I'd say they can be treated as plastic/insulator (though they aren't), so keep them as thin as possible, as long as all the components they cover properly touch the heatsink.
Those X-clamps have been used on ATI cards since the Radeon X1800 XT. If yours didn't come with one originally, just leave it as it is and take care that the PCB stays near straight.


----------



## chris189 (Mar 9, 2022)

My card has two problems: it won't output video at POST, & the core doesn't touch the heatsink even when tightened down all the way.  So I put it back together stock & started the RMA process.  I hope it gets repaired or replaced, because I was loving the RDNA 2 performance.


----------



## Butanding1987 (Mar 10, 2022)

chris189 said:


> My card has two problems: it won't output video at POST, & the core doesn't touch the heatsink even when tightened down all the way.  So I put it back together stock & started the RMA process.  I hope it gets repaired or replaced, because I was loving the RDNA 2 performance.


I can't believe you're RMA-ing your card after fu**ing it up and boasting about improvements earlier.


----------



## chris189 (Mar 10, 2022)

@Butanding1987 I bought the card from some guy who mined on it & damaged it before I bought it (he said it worked perfectly & didn't mention any damage).  Plus it was working perfectly (minus the occasional flicker & screen corruption) until I restarted, & that's all she wrote, bro.  It was working fine, never overheated or anything before the restart; it was hitting 55-60°C on the core & about 80°C on the hotspot.  Then I restarted & it didn't POST anymore.  So it's not my fault, it's the card's fault, bro.


----------



## pavle (Mar 10, 2022)

Oh my, so it was used for mining? Nasty, and it even flickered. Well, I hope your RMA goes through, because every mining card has tired memory and is prone to failure, despite some people saying it's OK to buy a miner card (it would be fine if the memory hadn't been run at tight timings and had never heated above 50°C).


----------



## ThrashZone (Mar 14, 2022)

Hi,
Nice they allow warranty transferring.


----------



## GoldenX (Mar 14, 2022)

Got a https://www.sapphiretech.com/en/consumer/pulse-radeon-rx-6500-xt-4g-gddr6

The good:

- Got it for cheap, and the boss paid for it.
- Looks nice.
- Sips power.
- Zero-RPM fans at idle.
- Good heatsink; low noise and temps under full load.

The bad:

- Vega video decoder; it loses frames watching YouTube. Five years and they never fixed this sh*t.
- Low performance when bandwidth gets saturated.
- Related to the previous point: performance is only consistent if you give it all the goodies, PCIe 4.0 and SAM.
- Ray tracing performance is as bad as a 1660 SUPER, a card that has no RT units.

The Ugly:

- Clock management is completely useless trash.


----------



## chris189 (Mar 18, 2022)

Thanks. So do you know where I can buy an RX 480 X-clamp? The card badly needs one when the replacement comes in, to get uniform core pressure on the heatsink and avoid damaging the core with too hot a hot spot. I'm going to baby the card when I get it back from RMA. I may not even open it up at all, just leave it stock and maybe undervolt it; it depends...


----------



## pavle (Mar 18, 2022)

I do not know where you could buy an X-clamp, perhaps try one of the older cards' clamp, e.g. HD 2900 XT or similar, they had a big core and such a clamp (if it's of the same dimensions).
But foremost when you get the card from RMA process, just leave it as it is and do not open it up if the temperatures are good enough.


----------



## chris189 (Mar 18, 2022)

Thanks @pavle. The only problem with stock hot spot performance is that it hits 106-110°C overclocked on the Gigabyte Eagle cards. With an X-clamp from the RX 480 installed, it achieves an 83°C hot spot at max overclock (with the Delta fans, of course), so it's a trade-off.

I think I'll leave it stock and talk it over with you guys for a while, and we'll vote on whether I should pull it apart to fix the temperatures, or just bear the high hot spot temps and run reduced clocks with reduced voltages.


----------



## Fatal Fighter (Mar 31, 2022)

Guys, is it possible that Gigabyte sells the coolers for their cards separately? I need this one's cooler: https://www.techpowerup.com/review/gigabyte-radeon-rx-6900-xt-gaming-oc/2.html, as a watercooled card is very difficult to sell in my country.


----------



## Garlic (Apr 24, 2022)

New card


----------



## kapone32 (Apr 24, 2022)

Fatal Fighter said:


> Guys, is it possible that Gigabyte sells the coolers for their cards separately? I need this one's cooler: https://www.techpowerup.com/review/gigabyte-radeon-rx-6900-xt-gaming-oc/2.html, as a watercooled card is very difficult to sell in my country.


I am putting a block on one of those cards next week. You can get the shroud if shipping won't be too expensive.



chris189 said:


> Well I hooked it all up anyway because I couldn't bear to wait.  It's working fine but they don't quite achieve maximum rpm.  Just about 6,000 rpm max vs 6,150 rpm with a single fan hooked up but it's all good.
> 
> It dropped my hot spot in 3dmark Time Spy by -21°C, from 106°C to 85°C which is amazing.  It's also amazing how hot the RAM gets on the GDDR6.
> 
> ...


This is insane but it also shows how flexible PCs are


----------



## Garlic (Apr 25, 2022)

Hey guys, is there anyway to increase core voltage on my toxic that doesn’t require me taking it apart.. thanks


----------



## Fatal Fighter (Apr 25, 2022)

kapone32 said:


> I am putting a block on one of those cards next week. You can get the shroud if shipping won't be too expensive.


Thank you so much! 
But I decided to try once again and sell the card here in my country, for about 1300$.
Is the price too high?


----------



## keento (May 1, 2022)

Do you know why I can't use the "More Power Tool" software with my RX 6600?


----------



## Kabouter Plop (May 29, 2022)

Anyone else getting this when scrolling through animated backgrounds inside Steam?








It only happens inside Steam, not in Firefox, and only with hardware acceleration on for web pages. I don't think my GPU is dying, since it's happening too consistently. Resizing the window makes it less glitchy, though it doesn't go away.


----------



## Athlonite (May 30, 2022)

keento said:


> Do you know why I can't use the "More Power Tool" software with my RX 6600?


Did you read the how-to on the website for it?


----------



## chris189 (Jun 2, 2022)

@keento Did you figure it out? It's quite easy. Dump your BIOS file with GPU-Z and load it into the tool, then you can dial 'er in, bro.


----------



## keento (Jun 2, 2022)

@chris189 

Yes, I figured it out, but it didn't help.

I've read it doesn't actually change the GPU BIOS, but the registry instead.


----------



## jerseydevil24 (Jun 6, 2022)

I have an MSI RX 6600 with the power limit at +0%: GPU at 54 degrees, 2670 MHz, 100 W max. Has anyone run tests after modifying the BIOS? With 120 W and a power limit of +20%, will the efficiency gain be greater? In the Adrenalin drivers the GPU limit is 2900 MHz and memory 1900 MHz, but it doesn't reach that in games in this mode?


----------



## tpu7887 (Jun 10, 2022)

Got my RX 6600 XT yesterday 

Image below: HWiNFO 64 "Maximums"




So I have a 6600 XT and I've overclocked it a bit, because why not? During a physics benchmark my GPU's hot spot temperature got up to 102 deg C, while the GPU temperature's maximum was only 69. This seems like a big difference to me. At idle the difference is only 3 deg C. The Unigine Heaven Benchmark 4.0 maximum temperature (GPU) is 63 deg C, with the hotspot at 83. The card's usual TDP is 135 watts, which I increased to 162. This isn't to run the card at 162 watts; I've adjusted the clocks so that the average stays well below 135. The increased maximum is mainly there to boost performance on complex scenes, so the framerate doesn't drop below my monitor's VRR minimum.
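For reference, I did the 135-to-162-watt bump via the driver's power-limit percentage slider; the conversion is trivial (a minimal sketch, with only my card's two wattage figures as inputs):

```python
# Sketch: convert a target power cap into the percentage you'd set on the
# power-limit slider. Numbers are from my card: 135 W stock, 162 W target.
def slider_percent(stock_w: float, target_w: float) -> float:
    """Slider percentage relative to stock (positive raises the limit)."""
    return (target_w / stock_w - 1.0) * 100.0

print(round(slider_percent(135, 162)))  # 162 W works out to +20%
```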

Is it likely my card will hit these high temperatures again during real world use? I assume it's not safe for the card to be hitting over 100C. Once I set things up to use I don't check on them routinely. I don't want to end up with a broken or damaged card


----------



## Athlonite (Jun 11, 2022)

That's quite a big difference between hotspot and package temp on that 6600 XT. Which one do you have, specifically?


----------



## tpu7887 (Jun 12, 2022)

Athlonite said:


> That's quite a big difference between hotspot and package temp on that 6600 XT. Which one do you have, specifically?



It's the Gigabyte 6600 XT Gaming OC 8G. I've done more testing: it's only during synthetic benchmarks that the difference is over 30 degrees; in games it's ~14-18.


----------



## Athlonite (Jun 12, 2022)

tpu7887 said:


> It's the Gigabyte 6600 XT Gaming OC 8G. I've done more testing: it's only during synthetic benchmarks that the difference is over 30 degrees; in games it's ~14-18.


Ah well that's not so bad then


----------



## Kabouter Plop (Jun 18, 2022)

Wattman is resetting on a safe shutdown sometimes now. I've heard this is an old bug that returned, and that disabling Fast Startup fixes it.


----------



## kapone32 (Jun 18, 2022)

tpu7887 said:


> Got my RX 6600 XT yesterday
> 
> Image below: HWiNFO 64 "Maximums"
> View attachment 250546
> ...


It is likely that your card does not have enough thermal paste on the GPU die, plus sub-optimal thermal pads. The funny thing is Gigabyte GPUs are about the easiest to take apart. If you have never done it before, watch several videos on YouTube on taking it apart before trying. You will need a good thermal paste, and you may want to replace the thermal pads as well, since they get so hot that they meld to the memory chips. Use a good thermal paste like Noctua NT-H2, Arctic MX-5, or an equal.

I have had to do this with every Gigabyte card I have bought since the Vega 64. I just bought a 6500XT and was experiencing the same thing as you (triple fan cooler). When I took the card apart even the super small die for the 6500XT did not have enough paste and the pads had disintegrated to some degree.


----------



## tpu7887 (Jun 20, 2022)

kapone32 said:


> It is likely that your card does not have enough Thermal Paste on the GPU die and sub optimal thermal pads. The funny thing is Gigabyte GPUs are about the easiest to take apart. If you have never done it before watch several videos on Youtube on taking it apart before trying. You will need to use a good thermal paste and you may want to replace the thermal pads as well as they are so hot that they meld to the memory chips. Use good thermal paste like Noctua NTH2, MX-5 or some equal.
> 
> I have had to do this with every Gigabyte card I have bought since the Vega 64. I just bought a 6500XT and was experiencing the same thing as you (triple fan cooler). When I took the card apart even the super small die for the 6500XT did not have enough paste and the pads had disintegrated to some degree.



Are you sure it's a thermal paste issue? Because the GPU temperature didn't rise to above 70 deg C and the fans were nowhere near their maximum yet. Having a >30 degree variation from one spot on a chip to another seems more like a problem from a lot of current going through a small part of a chip. Though GPUs are pretty big and don't have IHS like CPUs do, so I get where it could be a paste issue.


----------



## kapone32 (Jun 20, 2022)

tpu7887 said:


> Are you sure it's a thermal paste issue? Because the GPU temperature didn't rise to above 70 deg C and the fans were nowhere near their maximum yet. Having a >30 degree variation from one spot on a chip to another seems more like a problem from a lot of current going through a small part of a chip. Though GPUs are pretty big and don't have IHS like CPUs do, so I get where it could be a paste issue.


If it is not the GPU die then it could be the thermal pads on the memory. The thing though is that the cooler on that card is a triple fan variant so the GPU should not even be going to 70. It is not the IHS but the amount of Thermal Paste that is applied by Gigabyte from the factory. Like I said I have a 6500XT which is about 1/6 the size of a 6800XT and was seeing the same temps until I changed the Paste and pads. Now it maintains 60 C and the hot spot is around 69-75.


----------



## GerKNG (Jun 20, 2022)

My collection so far.
Super happy with the performance and overclocking, but the reliability and drivers are still not on Nvidia's level (Radeon Software unusable until reboot, taskbar flickering through sometimes, artifacting in R6 Siege for over a year: a finger-wide bar flickering across the screen for a few milliseconds every couple of minutes).

But still, aside from ray tracing performance, RDNA2 won the current generation for me.


----------



## DoLlyBirD (Jun 20, 2022)

tpu7887 said:


> Are you sure it's a thermal paste issue? Because the GPU temperature didn't rise to above 70 deg C and the fans were nowhere near their maximum yet. Having a >30 degree variation from one spot on a chip to another seems more like a problem from a lot of current going through a small part of a chip. Though GPUs are pretty big and don't have IHS like CPUs do, so I get where it could be a paste issue.











6700 XT Hotspot Temperature 95c anything to worry about? (www.techpowerup.com)
Playing RE3 remake with RT on, max settings 1080p 100fps cap, the hotspot temp on my Sapphire Nitro+ is hitting 95-98c with 100% GPU load, GPU core temp was around 68c, coming from a 5600 XT I never saw the hotspot temp go above 80-85c with a manual fan curve, is this normal behaviour or should...




Seems to be fairly normal for RDNA2; I asked the same question and there are two pages of replies. Just make sure your case airflow is adequate and set a custom fan curve on the GPU. Apparently the hotspot can hit up to 110c without thermal throttling; past that you might see some, as the GPU will literally lower its clocks and voltages so as not to damage itself.


----------



## tpu7887 (Jun 25, 2022)

kapone32 said:


> If it is not the GPU die then it could be the thermal pads on the memory. The thing though is that the cooler on that card is a triple fan variant so the GPU should not even be going to 70. It is not the IHS but the amount of Thermal Paste that is applied by Gigabyte from the factory. Like I said I have a 6500XT which is about 1/6 the size of a 6800XT and was seeing the same temps until I changed the Paste and pads. Now it maintains 60 C and the hot spot is around 69-75.




The cooler's fans were only spinning between 1750 and 1900 RPM out of their maximum of 3300-3400 RPM; it looks to me like a 70 deg C GPU temperature was their target.
If I were programming the fan profile, I probably would've done the same.
Worst-case gaming (no frame limiting, fully loaded card) makes the hot spot 84-88 degrees _after_ max power is increased from 130 to 162 watts?
Sounds very reasonable to me!



DoLlyBirD said:


> Seems to be fairly normal for RDNA2; I asked the same question and there are two pages of replies. Just make sure your case airflow is adequate and set a custom fan curve on the GPU. Apparently the hotspot can hit up to 110c without thermal throttling; past that you might see some, as the GPU will literally lower its clocks and voltages so as not to damage itself.



What was running on your card when you noticed the hot hotspot?
How high were your temps (GPU/hotspot)?


----------



## sam_86314 (Jun 25, 2022)

Here's my new toy.




















It was $687 before tax. Pretty close to MSRP, and more than I'd usually consider paying, but it's the exact model I've been wanting, and I'm in a pretty good place financially right now.

I'm currently running it on the Quiet BIOS, underclocked to 2250MHz @1100mV with the power limit set to the minimum. I dabbled with overclocking a bit and determined that the minimal gains I got in performance weren't worth the extra power usage and heat. Underclocking served me very well with my 5700XT, so I'm going to be doing it with this card too.


----------



## Splinterdog (Jun 28, 2022)

I'm seriously considering upgrading my Asus ROG Strix RX 5700 XT to an RX 6800 XT for numerous reasons, not least the pain-in-the-ass driver crashes and the VRAM clock issue, which I can't see ever being solved.
On the other hand, RX 6800 XT performance appears to be nearly twice the 5700 XT's, which is reason enough. I'll put the 5700 XT in my second rig and sell the ROG Strix RX 580, which is the plan to partly fund the new card.
The limited choice I have here is:
Asus TUF
Sapphire Nitro and Pulse
ASRock Phantom
Gigabyte
Colorful
The ROG Strix water-cooled is way out of my price range.
Any opinions on these cards?


----------



## Fatal Fighter (Jun 28, 2022)

Splinterdog said:


> Any opinions on these cards?


Just do not get Gigabyte.
I own two Gigabyte RX 6900 XT Xtreme WaterForce cards, and the waterblock on these is very low quality. So low that I had to order a waterblock from Alphacool... The hot spot difference is 30°C+, and the thermal pads are cheap af.


----------



## Splinterdog (Jun 28, 2022)

I'm leaning towards the Asus TUF at the moment.


----------



## skizzo (Jun 28, 2022)

tpu7887 said:


> It's the Gigabyte 6600 XT Gaming OC 8G. I've done more testing: it's only during synthetic benchmarks that the difference is over 30 degrees; in games it's ~14-18.


OCing will make the delta greater. Both values (the ~30°C+ delta and the ~18°C delta) are normal considering the circumstances.

My RX 5700 XT had a large delta too; it was at least 30°C when OC'd and H2O-cooled.

My RX 6900 XT, on the other hand, does not.

Both make sense and are in line with reviews. Always read reviews; nothing unusual is being reported here, so there's no reason to investigate or worry about it further.




Splinterdog said:


> I'm seriously considering upgrading my Asus Rog Strix RX 5700 XT to RX 6800 XT for numerous reasons, not least the pain in the ass driver crashes and also the V Ram clock speed which I can't see ever being solved.
> On the other hand, the RX 6800 XT performance appears to be nearly twice the 5700, which is reason enough  I'll put the 5700 in my second rig and sell the ROG Strix Rx 580, which is the plan to partly fund the new card.
> The limited choice I have here is
> Asus TUF
> ...


Sapphire has been my brand of choice for AMD cards, that or straight from the horse's mouth, aka AMD.
The Nitro is always their OC'd model and may require an additional PCIe power cable compared to other models as a result. Otherwise, the Pulse models are what I personally go for if staying on air. I would read reviews and get whichever one is quieter, since performance is going to be so close that I doubt you'd notice anything outside of benchmarks. If you're going to H2O-cool it, then go for performance (which means get the Nitro), since noise will be determined by what I assume is an already existing loop setup, i.e. fans and pump.


----------



## Splinterdog (Jun 28, 2022)

I won't be water cooling the card because I don't have room in the case what with a 240mm Corsair already in place at the top. Besides, water cooled cards with the rads already fitted are too expensive down here.
I've had Sapphire before as well and found them reliable, but I also like Asus. Just depends on what I can afford, but it must not be lower than RX 6800 XT, otherwise it's a sideways move.


----------



## GamerGuy (Jun 29, 2022)

@skizzo Agree wholeheartedly about Sapphire, and would like to expand it to include PowerColor and XFX. 

Basically AMD's exclusive AIB partners. Presently have a Sapphire RX 6900 XT Nitro+ and a PowerColor RX Vega 64 Red Devil.


----------



## tpu7887 (Jul 1, 2022)

skizzo said:


> OCing will make the delta greater. Both values (~30*C+ delta and ~18*C delta)  are normal considering circumstances.
> 
> My RX 5700 XT had a large delta too, and it was at least 30*C when OC and H20 cooled.
> 
> ...



I'm not worried, people kept answering way after I thought things were handled.

Random question for you (and @GamerGuy ): I always thought Sapphire cards weren't that great. Well, I got that idea in my head back in the early/mid 2000s and stopped paying attention to their stuff. I didn't ignore them, just never took them as serious contenders, certainly not to the extent of "4090s are coming out, better find a retailer that stocks Sapphire at a good price".

Are they like Acer? If you don't know Acer, they're a company that makes a lot of different stuff - using laptops as an example of their product design philosophy early on as a younger company (mid 2000s to early 2010s): their laptops usually had the slowest processors, stupid models that barely cost less than somewhat capable ones, but they'd use them anyway. There usually wasn't enough RAM, and they'd use 2x1GB sticks instead of 1x2GB that could be easily upgraded to 4 by adding just one more 2GB stick. Not buying 2 and having 2x1s to throw out. They'd use 4200RPM hard drives with 2MB cache instead of 5400/7200 with 8/16. Their best product was a mid i5 (not the best i5) with a 45 W/h battery and the same crappy 1366x768 matte finish screen used throughout their barely differentiated models.

Now in 2022 they make a range of stuff, a lot that could be described as "good". People unaware of their past would see no problem with buying it, but me? I'm still wary. If I owned something of theirs, it'd always be in the back of my mind that they skimped on something I don't know about and my product might spontaneously fail. If it was a power supply for a very cheap product, I'd be concerned about leaving it plugged in when leaving home.
Because of what they so blatantly did in the past, I expect them to skimp on things you can't see; most people buying their stuff didn't have a lot of money and/or weren't technically literate. I didn't have much money back then (still don't, really), but instead of buying one of their turds for $549, I'd wait until an Asus laptop went on sale to $600 from $1000, getting the model configured with 1x4GB RAM and a 500GB HDD instead of 2x4GB RAM and a 1TB HDD/120GB SSD for $1450. In a few months I'd grab another 4GB stick of RAM for $60 and a 250GB SSD for $150, bringing my total to just $810. The Asus configured the same cost $1800. I spent less than half, and that laptop is _still_ capable; my mom's still using hers (mine fell off my bed 4 years ago and the screen cracked, though I would've retired it 3 years ago anyway).
Her specs: 4200U or 4210U (can't remember), 8GB RAM, 500GB Samsung 850 Evo, Intel 7260 AC (original). Christmas 2013 or '14 to now: 8 years. That Acer, I doubt it would've lasted 3. Her battery is original and gets 4-5 hours of browsing at high screen brightness, like it did when new.

Anyway... is Sapphire good now? Not something to spontaneously combust? Especially a video card with as much power as they take now...


----------



## Athlonite (Jul 1, 2022)

tpu7887 said:


> I always thought Sapphire cards weren't that great



For AMD they're probably the best out there. I buy nothing but Sapphire Nitro+ cards and haven't had a problem with them.


----------



## Garlic (Jul 4, 2022)

Hi guys, a few days ago I was messing with MPT on driver version 22.6.1. I have a Sapphire Toxic 6900 XT with the XTX silicon. I noticed I can now raise the VRAM clock limit and increase GPU core voltage without temp_dependent_vmin. I have talked to some people online but it doesn't seem to work for them; I suspect the Toxic's BIOS has something different from the rest that allows me to do this. However, I did notice that even with increased VRAM voltage I could not get the VRAM to clock higher than around 2170-2180 without a huge performance drop. The core clock was just fine and worked perfectly, though. Any feedback or comments are welcome.

here are some photos.


----------



## Kabouter Plop (Jul 5, 2022)

Anyone else on Windows 11 22H2? The driver install does scary stuff that carries over through a reboot. I've seen others report the same already; just wondering if anyone here has given it a try yet.


----------



## rapper sjors (Jul 10, 2022)

Hi guys,

Got an Asus Dual 6700 XT OC Edition to combine with my AMD 3600 + 16GB.






I am noticing a temperature of 84°C with the fan speed at maximum, and therefore a lot of noise. Has anyone successfully tweaked the noise level down a bit?


----------



## sam_86314 (Jul 12, 2022)

Recently did some comparison tests between my 6800 XT in my main system and my 5700 XT in my testing system.

















I'm blown away by the efficiency of RDNA2. The 6800 XT is only using about 20-30W more than the 5700 XT, and yet it performs nearly twice as well. 

Granted, the CPU in my testing system is much older than the one in my main system, and it doesn't have Resizable BAR or PCIe Gen 4. But I've found scores to be only about 10% lower than they were when the 5700 XT was in my main system.

Both cards are tuned for quiet operation. The 5700 XT is underclocked to 1850MHz @990mV, and the 6800 XT is underclocked to 2125MHz with no voltage change.
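Putting a rough number on that efficiency jump (a minimal sketch: the 200 W baseline for the 5700 XT is purely an assumed figure for illustration, and 1.9x stands in for "nearly twice"; only the +20-30 W delta comes from my logs):

```python
# Rough sketch: performance-per-watt gain going from the 5700 XT to the 6800 XT.
# The 200 W baseline for the 5700 XT is an assumed figure for illustration only;
# the +25 W delta and ~1.9x performance ratio reflect my comparison runs.
base_power = 200.0           # W, assumed baseline draw of the 5700 XT
new_power = base_power + 25  # the 6800 XT drew ~20-30 W more
perf_ratio = 1.9             # "nearly twice as well"

eff_gain = perf_ratio / (new_power / base_power)
print(f"~{eff_gain:.2f}x performance per watt")
```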


----------



## GreiverBlade (Jul 16, 2022)

Late entry:

got a PowerColor Red Devil RX 6700 XT,

couldn't be more pleased as a GTX 1070 upgrade.

The initial run scored low as hell even though I'd DDU'd the previous driver; I ran DDU again with more options and then updated to the latest driver (strangely enough, the drivers on TPU weren't the latest).

Now, that's more like it! SOTTR at 3K 60 fps? Ohhh, I'm in love... (a year late to the launch, but I'd already waited 6 years for a GPU upgrade, ahah)




oh well SOTTR 1080p score contest entry


----------



## Athlonite (Jul 17, 2022)

Kabouter Plop said:


> Anyone else on Windows 11 22H2? The driver install does scary stuff that carries over through a reboot. I've seen others report the same already; just wondering if anyone here has given it a try yet.


More info on the "scary stuff" please, if you don't mind. I'm on 22H2 but haven't installed 22.6.1 as of yet.


----------



## sam_86314 (Jul 17, 2022)

Not on Windows 11, but since updating to 22.6.1, I have been having a problem where the graphics driver crashes when I wake my PC from S3 sleep.

This wasn't an issue on any previous drivers with either this card or my 5700 XT.

It's really annoying because my underclock settings get reset and DP audio stops working until I open the sound control panel in Windows.


----------



## Kabouter Plop (Jul 17, 2022)

Athlonite said:


> More info on the "scary stuff" please, if you don't mind. I'm on 22H2 but haven't installed 22.6.1 as of yet.



All hardware drops out for 30-60 seconds, as if it stopped being recognized and the mainboard malfunctioned; after 30-60 seconds everything slowly pops back up, along with Radeon driver errors. The best way to install the driver so far is a driver-only minimum install; following up with the full install avoids these errors. This only happens once. And when I say drop out, I mean everything drops out, even the LAN adapter and eventually USB. It persists through a reboot as well, where the mainboard insists the GPU posted but you get no display until you power cycle.


----------



## GreiverBlade (Jul 17, 2022)

I had some freezes on the "latest" driver from here, but everything went back to normal, as expected, once I updated the driver via Adrenalin...
Since then, zero issues: no freezes, no performance loss.

And I just came from a GTX 1070 (although I had to run DDU twice).


----------



## Valantar (Jul 17, 2022)

tpu7887 said:


> Got my RX 6600 XT yesterday
> 
> Image below: HWiNFO 64 "Maximums"
> View attachment 250546
> ...


A bit late to the party here, but to me that sounds like a cooler mounting pressure issue - which it's quite widely documented that some Radeon designs have (AMD seems to specify oddly low mounting pressure for third party designs). Quite a few people mod their cards by adding 1mm or .5mm nylon washers in between the PCB and cooler mounting screws on the back to increase mounting pressure, which often helps improve this. If this bothers you, that's worth a try, but if not, don't worry - it is perfectly safe for the card to be operating at these temperatures. If it wasn't, the card would scale back clocks to compensate. IIRC tJmax for Radeon hot spot temperatures is 110 or 115°C, so you're well within the limits.


----------



## Felix123BU (Jul 17, 2022)

Kabouter Plop said:


> All hardware drops out for 30-60 seconds, as if it stopped being recognized and the mainboard malfunctioned; after 30-60 seconds everything slowly pops back up, along with Radeon driver errors. The best way to install the driver so far is a driver-only minimum install; following up with the full install avoids these errors. This only happens once. And when I say drop out, I mean everything drops out, even the LAN adapter and eventually USB. It persists through a reboot as well, where the mainboard insists the GPU posted but you get no display until you power cycle.


That definitely sounds like a damaged card, hardware-wise. I got the same on two different cards over the years; they died completely shortly after that. Hope yours lasts.


----------



## tpu7887 (Jul 19, 2022)

Valantar said:


> A bit late to the party here, but to me that sounds like a cooler mounting pressure issue - which it's quite widely documented that some Radeon designs have (AMD seems to specify oddly low mounting pressure for third party designs). Quite a few people mod their cards by adding 1mm or .5mm nylon washers in between the PCB and cooler mounting screws on the back to increase mounting pressure, which often helps improve this. If this bothers you, that's worth a try, but if not, don't worry - it is perfectly safe for the card to be operating at these temperatures. If it wasn't, the card would scale back clocks to compensate. IIRC tJmax for Radeon hot spot temperatures is 110 or 115°C, so you're well within the limits.



I've pretty much determined that it's not an issue: games don't produce a hotspot more than ~15 degrees hotter than the GPU temp. But nice trick! I might do it to my Z390 chipset. The thermal pad MSI chose to use leaks a bunch of oil all the time, and since I didn't have anything to replace it with, I decided to just use thermal paste, lol. Unfortunately my motherboard doesn't report the PCH temperature, but I think it's fine because my IR thermometer reports the heatsink at 52 deg C.


----------



## kapone32 (Jul 19, 2022)

I had the 22.6.1 driver installed and was getting nothing but shutdowns. I keep my PC updated in terms of Windows and run Windows 11. I went back to 22.5.1 and it has been smooth sailing. There is obviously a conflict between Windows 11 and that driver. I am hoping the next driver update will resolve the issue.



Valantar said:


> A bit late to the party here, but to me that sounds like a cooler mounting pressure issue - which it's quite widely documented that some Radeon designs have (AMD seems to specify oddly low mounting pressure for third party designs). Quite a few people mod their cards by adding 1mm or .5mm nylon washers in between the PCB and cooler mounting screws on the back to increase mounting pressure, which often helps improve this. If this bothers you, that's worth a try, but if not, don't worry - it is perfectly safe for the card to be operating at these temperatures. If it wasn't, the card would scale back clocks to compensate. IIRC tJmax for Radeon hot spot temperatures is 110 or 115°C, so you're well within the limits.


I have an EK block on my 6800XT, and I was getting that 20 degree difference between GPU and hotspot. I ordered a kit on Amazon that came with 0.5 to 2mm thick pads. I first used the 1.5mm pads all over the card, but that did not allow for a tight fit and my temps were in the 90s. I switched to the 1mm pads on the front of the card and that allowed for a nice, tight fit. The card regularly idles in the low 30s (regardless of the VRAM clock) and regularly runs in the high 40s to low 50s when gaming. I use a pretty costly (in Canada) Alphacool pump/res and their quick-connect tubing attached to a 420mm rad I took from an Eisbaer AIO. For me the 6800XT is the cat's meow, and FreeSync is much more impactful than ray tracing. I have said it before, but my favourite game to play is TWWH. Regardless of your GPU, 4x4 20-unit battles with Ultra unit size will have your game running in the 40s, and FreeSync has been a blessing in that regard.


----------



## nolive721 (Jul 20, 2022)

I am terrible.

After buying a 6600, a 6600 XT, and even a 6700 XT most recently, I've been bitten by the upgrade bug again and bought an RX 6800 on Yahoo Auctions this week.

It's a local Japanese brand model, but it's actually the PowerColor Fighter underneath; it even shows the PowerColor brand name on the GPU shroud. Got it for the equivalent of $400 and it's still under warranty.









KUROUTOSHIKOU Fighter RX 6800 Specs (www.techpowerup.com)
AMD Navi 21, 2155 MHz, 3840 Cores, 240 TMUs, 96 ROPs, 16384 MB GDDR6, 2000 MHz, 256 bit




I did some testing last night and the card overclocks pretty well (206 fps in Heaven at 1080p).
I am running stable at 2517 MHz core and 2116 MHz fast-timing VRAM, all at 1.025 V.
In the Heaven benchmark, temps go up to around 70°C max, and that's with the stock fan speed (very quiet, by the way).

Undervolting is also pretty good, with the core at 2249 MHz at 0.907 V in a quick check done at the same time as the overclocking.

The only few concerns I have, where I'd need help from other users if possible:

- There is a thin buzzing noise (coil whine?) that kind of reminds me of my Vega 64 LC; is that common on such a card?
- I have no idea how to operate the BIOS switch. GPU-Z tells me it's on the OC mode (2155 MHz boost), but flipping it doesn't switch to the quiet BIOS, which is supposed to drop to 2105 MHz.
- Even on the OC BIOS with a custom overclock, the power draw barely exceeds 200 W (see snapshot). I might be confused by the card's behaviour, but the TDP is supposed to be 250 W for this model, so why the limitation shown in the Wattman reporting here?




thanks a lot

Oli


----------



## nolive721 (Jul 21, 2022)

Anyone kind enough to advise me on these 3 concerns?


----------



## Athlonite (Jul 21, 2022)

nolive721 said:


> I might be confused by the card behaviour but TDP is meant to be 250W for this model so why such limitation shown on WATTMAN reporting here?


Actually, it's only meant to be a 225 W TDP, as the 6800s weren't meant to be highly OC'able; the big OC'er was the 6800 XT.



nolive721 said:


> Anyone kind enough to advise me on these 3 concerns?


Sure, what do you have going on that you need help with?


----------



## ratirt (Jul 21, 2022)

GreiverBlade said:


> late entry,
> 
> got a Powercolor Red Devil RX 6700 XT
> View attachment 254911
> ...


I so dig the Red Devil editions. Got one myself and it works great. Had one previously; tried the Sapphire as well, but the Red Devil is the one.
It must have been a crazy jump for you, replacing the 1070. I jumped from a 5600 XT Pulse to a 6900 XT and it was like night and day.
Grats!


----------



## rapper sjors (Jul 21, 2022)

My upgrade was from a 960 2 GB to a 6700 XT.


----------



## mama (Jul 21, 2022)

What is the best water-cooled GPU for a custom loop? I'm thinking about a 6950 XT and assume that any water block for a 6900 XT will fit, but am I wrong about that?


----------



## GreiverBlade (Jul 21, 2022)

nolive721 said:


> that's a local Japan brand model but actually is the Powercolor Fighter underneath, it even shows the Powercolor brand name on the gpu shroud.  Got it for the equivalent of $400 and its under warranty still.


Indeed, that's normal: Kuroutoshikou is the Japan-market variant from TUL, which holds PowerColor as you mentioned, and also Club3D, VTX3D, Sparkle and VisionTek (that last one seems shared with HIS). $400 is a nice find.


ratirt said:


> It must have been a crazy jump for you replacing the 1070. I jumped from 5600xt Pulse to 6900xt and it was like night and day.
> Grats


Yep, the jump was a refreshing breeze. I did choose the PowerColor Red Devil initially, but it was way too overpriced in Switzerland: a week before buying it, it was still 800 CHF; the week I got interested in buying it, 650 CHF; and the day I decided to order, BAM, special promotion at 450 (~31% reduction, or is my math dead and gone?).

Design-wise, it's literally one of the best-looking cards I've ever had (heavy, though; I might get a GPU brace, but for now no sagging).


----------



## nolive721 (Jul 21, 2022)

GreiverBlade said:


> indeed normal, KurouToshikou is the Powercolor Japan market variant from TUL, which hold Powercolor as you mentioned and also Club3D, VTX3D, Sparkle and Visiontek (that last one seems shared with HIS), 400$ nice find
> 
> yep the jump was a refreshing breeze and i did choose the Powercolor Red Devil initially but it was way too overpriced in Switerland, a week before buying it it was still 800chf and then the week i got interested in buying it 650chf and the day i decided to order : BAM special promotion at 450  (~31% reduction? or my math is dead done and gone?)
> 
> design wise, it's literally one of the best looking card i ever had (heavy tho  i might get a GPU rig, but for now no sagging ).


Wow, you seem super knowledgeable about the brand. I actually have an SFX PSU from them in my NR200P case and it has been working well for a year or so. The card itself had positive reviews, although it's considered a low-cost variant, so I didn't hesitate much in the end.
I had an RX 480 Red Devil back in the day and I loved it, but just to clarify, the card I bought is the PowerColor Fighter model, and I won it at auction, so it's a second-hand product.
Still, I see it as great value, considering new GPU prices are still high in Japan.



Athlonite said:


> Actually it's only meant to be 225W TDP as the 6800's weren't meant to be highly OC'able the big OC'er was the 6800XT
> 
> 
> Sure what do have going on that you need help with


OK, thanks for the reply, but I'm still confused, as the TechPowerUp database itself shows a 250 W TDP.






Regarding my questions, maybe I should have given more details; sorry about that.
To develop a bit further:



1. There is a thin buzzing noise (coil whine?) that reminds me of my Vega 64 LC. Is that common on this card?

The noise is more of an intermittent zzz...zz.zzzz..zzz electrical noise than typical coil whine. Is that normal for an RX 6800, or do I have to worry about the longevity of the card?

2. I have no idea how to operate the BIOS switch. GPU-Z tells me it's on the OC mode (2155 MHz boost), but flipping it doesn't switch to the quiet BIOS, which is supposed to drop to 2105 MHz.

I remember owning a Red Devil GPU with a physical BIOS switch that you had to flip, then reboot the PC, and the related BIOS mode was active at restart in Windows. I've tried the same approach here, and GPU-Z tells me it's still on the OC BIOS even if I flip the switch to quiet mode, reboot, etc., so I'm confused about the correct process to switch BIOSes.

3. Even on the OC BIOS with a custom overclock, the power draw barely exceeds 200 W (see snapshot). I might be confused by the card's behaviour, but the TDP is supposed to be 250 W for this model, so why such a limit in the Wattman reporting here?

To be clear, the GPU performs as it should: clock-wise and fps-wise it matches the reviews I've read of the RX 6800, so I'm happy, and it actually undervolts and overclocks pretty well too, as I mentioned in my original post. What confuses me is that the power draw doesn't increase dramatically even at full voltage with the 10% power-limit increase enabled. I did a sanity check with HWiNFO and it reports the same wattage as Wattman, so I'm really curious to understand what could be happening. Is it related to the BIOS thing in point 2? Something else?


----------



## Athlonite (Jul 21, 2022)

Ah, I see what you mean about the TDP wattage, and in a way they're right: if you take the 225 W standard + 10%, you get 247.5 W (rounded up to 250 W), but that's the absolute max, and you'll more than likely never see it. I never do on my RX 6800.
This Heaven benchmark run's power draw maxes out at 207 W.





And even if I crank the OC up to max, it still only creeps up to 220 W.
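Just to spell that power-limit arithmetic out, here's a quick sketch (the 225 W base and +10% slider are the figures from this discussion; Wattman only exposes the slider, not this formula, so treat it as an illustration):

```python
def max_board_power(base_tdp_w: float, power_limit_pct: float) -> float:
    """Effective power cap after applying the power-limit slider (%)."""
    return base_tdp_w * (100 + power_limit_pct) / 100

# RX 6800: 225 W standard TDP with the slider at +10%
print(max_board_power(225, 10))  # 247.5 W, i.e. the ~250 W ceiling
print(max_board_power(225, 0))   # 225.0 W at stock
```

So a card that never breaks ~207 W in Heaven is sitting well inside its cap either way.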

As to the coil whine: when does it actually happen? During normal gameplay, only in those really high-FPS areas like menus, or all the time no matter what you're doing?

As for flipping the BIOS switch, it's recommended to shut down the PC, flip the switch, and then boot, unless you have software like Sapphire's TriXX, which lets me switch it in software and then reboot to activate it.


----------



## nolive721 (Jul 22, 2022)

Athlonite said:


> Ah I see what you mean there about the TDP Wattage and in a way they're right if you use the 225W standard + 10% = 247.5W (rounded up to 250W) but that's the absolute max you'll more than likely never see that I never do on my RX6800
> This is Heaven benchmark power draw max's out at 207W
> 
> View attachment 255500
> ...


OK, let me digest all that over the weekend.

Your Heaven benchmark is very helpful for confirming the TDP question.

Can you share your core and mem clocks, just for comparison?


----------



## Athlonite (Jul 22, 2022)

nolive721 said:


> Ok let me digest all that in the weekend
> 
> your Heaven benchmark is very helpful to confirm TDP question
> 
> can you share your core and men clocks just for comparison benefit?


At 2244 MHz it pulls 223 W @ 1025 mV with a +10% power limit, but temps are crap: the GPU edge temp is 50°C, but the hotspot is 80°C and it's holding back the clocks.
Looks like I need to redo the thermal paste on the GPU die.


----------



## nolive721 (Jul 22, 2022)

Thanks

Your clocks seem low, or mine high; maybe silicon lottery, or I'm still doubtful of Wattman's reporting on my card.

What is your Heaven 1080p score with that OC?
Mine is 206 fps average, with a max around 415 fps.

thanks!


----------



## nolive721 (Jul 23, 2022)

nolive721 said:


> there is a thin buzzing noise (coil whine?) that kind reminds me of my VEGA 64LC, is that common on such card?
> the noise more of a intermittent zzz...zz.zzzz..zzz electrical noise rather than coil whine type. so is that typical from RX6800 or do I have to worry about the longevity of the card?
> 
> 2. I have no idea how to operate the BIOS Switch. GPUZ is telling me its on OC mode (2155Mz boost) but pushing it doesn't bring it to the Quiet bios that is meant to be down to 2105Mhz
> ...


OK, TDP cleared up.

Now, for whatever reason, I've also managed to get the BIOS switch to do something that seems right.

After a reboot, GPU-Z now shows a 2155 MHz boost clock for the quiet BIOS switch position (and MPT shows 203 W in the TDP cell), and if I flip to the OC BIOS and reboot, GPU-Z shows a 2190 MHz boost clock with MPT reporting a 215 W TDP, so it's working as intended.

Now, speaking of MPT, I'm going to give it a try with the card and report back if it does any good pushing my already decent OC a bit closer to stock 6800 XT performance.

Wish me luck.


----------



## Athlonite (Jul 23, 2022)

nolive721 said:


> Thanks
> 
> your clocks seem low or mine high
> maybe silicon lottery or I am still doubtful of Wattman reporting or behavior of my card
> ...


15:37:18 Time:    260.608
15:37:18 Frames:    53514
15:37:18 FPS:    205.343
15:37:18 Min FPS:    52.8846
15:37:18 Max FPS:    448.893
15:37:18 Score:    5172.58


----------



## nolive721 (Jul 23, 2022)

Thanks. Your score is aligned with mine, then, yet I'm surprised your clocks seem so low for that score. Are you sure that's 2244 MHz, and on the core?


----------



## mechtech (Jul 23, 2022)

Splinterdog said:


> I'm leaning towards the Asus TUF at the moment.


Just recently got a second-hand 6800 Asus TUF. On the quiet-mode BIOS it's very quiet and temps are low, but I haven't had the time to really run it through its paces yet.
I even disabled fan stop to reduce temps further, and it's inaudible in the case.


----------



## Athlonite (Jul 23, 2022)

nolive721 said:


> Thanks.  your score is aligned with mine then, yet I am surprised by your clocks that seem low to achieve such a score, you are sure that's 2244Mhz and for Core?


Yup, those are the core clocks. I watched them during the bench, and 2236~2245 MHz was where they hung around.


----------



## nolive721 (Jul 23, 2022)

OK, thanks again for the reply, much appreciated.
I think I'm going to put my brain at ease about the Heaven benchmark result, because with clocks somewhat similar to yours plus a memory OC (see snapshot), I'm getting only 193 fps in a 1080p Extreme run.

I need to go to my max OC of 2517 MHz core to achieve the same results as yours.

I should stop overthinking and enjoy the card for what it is; I haven't even tried MPT since yesterday.


----------



## FreedomEclipse (Jul 26, 2022)

I'll be joining shortly. Warm up a seat for me.


----------



## eyderr (Jul 26, 2022)

Hello guys!
I just installed a Sapphire Pulse RX 6600, and I noticed the temperatures are high! While gaming, the hotspot temperature reaches 90°C in some games, all at stock settings.

There are many reports on the internet that this is not good. What does the community recommend?
Should I be concerned?


----------



## Lew Zealand (Jul 26, 2022)

I have a Pulse 6600 XT and my hotspot temps can reach close to 90°C in stock config, which I understand to be fine. You want your edge temps to stay below 83-85°C, and with those hotspots your edge temps are probably in the low-to-mid 70s, which is good. I prefer lower temps than this, so I run the 6600 XT with a 100 MHz underclock (tops out at 2500 MHz) and rarely reach higher than 64°C in a 27°C room (hotspot in the mid 80s), though the cooler on the 6600 XT may be a bit bigger than the 6600's.


----------



## eyderr (Jul 26, 2022)

Lew Zealand said:


> I have a Pulse 6600XT and my Hot Spot temps can reach close to 90ºC in stock config which I understand to be fine. You want your edge temps to stay below 83-85ºC and with those Hotspots, you're edge temps are probably in the low-mid 70s which is good. I prefer lower temps than this so I run the 6600XT with a 100MHz underclock (tops out at 2500 MHz) and I rarely reach higher than 64ºC in a 27ºC room (Hotspot in the mid 80s) but perhaps the cooler on the 6600XT is maybe a bit bigger than the 6600.


I'm thinking of changing the fan curve. I don't care about noise, but I'm afraid that lowering the temperature will damage the fans by running them at very high speeds. Thinking about longevity, which would be worse:
a 90°C hotspot or fans at 90%?


----------



## Lew Zealand (Jul 26, 2022)

eyderr said:


> I'm thinking of changing the fan speed. I don't care about noises. But I'm afraid of lowering the temperature and damaging the fans because they are running at very high speeds. What would be worse?
> Hot spot 90º or Fans in 90%?
> thinking about longevity



That's a real good question. I don't focus on hotspot temps, as I'm simply used to having access to only the regular GPU temps (the edge temps). Hotspot was only made available for Nvidia GPUs recently in GPU-Z, and you don't get it at all with Intel iGPUs (which I used extensively in the past; I've only been a Radeon user for about a year and a half now).

I generally like my edge temps to be at ~65-68°C, which on my 6400 and 6600 XT means the hotspot is at about 85-88°C under load. However, the Pulse 6400 runs at about 73°C in my Dell 9020 thanx to mediocre case ventilation, and you know what, I need to have a look at the hotspot temps. But I'm perfectly fine with 73°C even if the hotspot is at 90°C. Dammit, I'm curious now...

What I'd do is target below 90°C running a typical load (I use Unigine Valley, though many people seem to like Heaven) on a loop in a window and monitor everything. Turn up the fans in whatever software you use (I use Afterburner) to an acceptable level.

Here's a similar setup I used to have: a crap-cooler GTX 1080 that went to 78°C in a cool room and over 80°C in a warm one (so hotspots of 90+°C). So I undervolted and tuned it (0.9 V, 1923 MHz core, 11,500 MHz memory) so it used 130 W instead of 180 W. It topped out at 68-74°C in a cool-to-warm room with fans ramping up to 65-75%, depending on temp. IMO 100% success.

Find that happy medium on your card, as it doesn't need to be either 90% fans or 90°C. It could be 75% fans and an 85°C hotspot, and that's a nice compromise.

BTW, after years of use I bought an aftermarket cooler for fun, and that 1080 now runs flat out at 62°C edge temps at 2100 MHz. But save those shenanigans for _much_ later.
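To make that "happy medium" idea concrete, here's a toy sketch of a custom fan curve as straight-line interpolation between temperature/speed points. The points below are made up for illustration, not from any actual card's curve; real tools like Afterburner do this interpolation for you.

```python
def fan_speed(temp_c: float, curve: list[tuple[float, float]]) -> float:
    """Linearly interpolate fan speed (%) from a list of (temp °C, speed %) points."""
    pts = sorted(curve)
    if temp_c <= pts[0][0]:
        return pts[0][1]          # below the curve: hold the minimum speed
    if temp_c >= pts[-1][0]:
        return pts[-1][1]         # above the curve: pin to the maximum speed
    for (t0, s0), (t1, s1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

# Example: ramp gently instead of sitting at 90% all the time
curve = [(40, 30), (65, 55), (85, 75), (95, 100)]
print(fan_speed(75, curve))  # 65.0, a compromise between quiet and cool
```

The point is that the curve can hit 100% only as a last resort near the hotspot limit, while typical gaming temps land somewhere in the middle.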
__________

Update:

FWIW, Pulse 6400 in Dell 9020 MT after a 20 minute loop in Valley at 1080p High:

27ºC room
74-75ºC GPU Temp
81-82ºC GPU Temp (Hot Spot)
GPU fan at 68%, 4000 RPM.  But sounds like a normal GPU fan, not a buzzsaw so I wonder if speed is reported correctly?

Deltas are only 7ºC from die to Hot Spot in this design.  The Dell case fan is lazing along; I need to see how to get it working a bit faster to clear built-up case heat.  6600XT next...

Pulse 6600XT (@stock settings) in Coolermaster N200 after a 20 minute loop in Valley at Ultra:

27C Room
69-70C GPU Temp
85-89C GPU Temp (Hot Spot)
GPU fan variable, topping out at 54%, 1950 RPM, noise not much different than 6400

Deltas here are close to 20C from die to Hot Spot.  I'm not concerned.
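The delta figures above are just hotspot minus edge; a trivial helper if you're logging both and want the comparison explicit (the sample numbers are the readings from this post):

```python
def hotspot_delta(edge_c: float, hotspot_c: float) -> float:
    """Die-to-hotspot delta in °C; ~7 °C is tight, ~20 °C is still considered fine."""
    return hotspot_c - edge_c

print(hotspot_delta(74.5, 81.5))  # Pulse 6400: 7.0
print(hotspot_delta(69.5, 87.0))  # Pulse 6600 XT: 17.5, close to 20
```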


----------



## FreedomEclipse (Jul 27, 2022)

My initial opinion of the card before purchasing was that it would be a bottom-tier card that runs loud and hot because of cost-cutting measures. However, upon receiving the card and making a closer inspection, I'd place it closer to an MSI Gaming X level of card, as the heatsink has a substantial amount of meat on it. I don't think I'll need to move the Accelero Xtreme over, but I'll know more about temps once it's installed.

Some side by side shots with the now deceased 1080Ti:



From the side:



It's quite a wide card.

It's going in shortly.


----------



## Valantar (Jul 27, 2022)

eyderr said:


> I'm thinking of changing the fan speed. I don't care about noises. But I'm afraid of lowering the temperature and damaging the fans because they are running at very high speeds. What would be worse?
> Hot spot 90º or Fans in 90%?
> thinking about longevity


Hotspot temps at 90°C aren't going to affect your GPU's longevity _whatsoever_. Zero difference. Heck, we've all been running our GPUs that hot for decades - they just didn't have the thermal sensors to show us this before now. Fans at 90% on the other hand? That's going to wear out both their bearings and your eardrums. Failed fans are responsible for the _vast_ majority of GPU failures. Leave it.


----------



## Athlonite (Jul 27, 2022)

Valantar said:


> Hotspot temps at 90°C aren't going to affect your GPU's longevity _whatsoever_. Zero difference. Heck, we've all been running our GPUs that hot for decades - they just didn't have the thermal sensors to show us this before now. Fans at 90% on the other hand? That's going to wear out both their bearings and your eardrums. Failed fans are responsible for the _vast_ majority of GPU failures. Leave it.


I've never had a Sapphire GPU suffer a fan failure, and I run custom fan curves on mine that don't allow the fans to turn off. Hotspot temps of 90°C are bloody hot; they don't need to be that high.


----------



## Valantar (Jul 27, 2022)

Athlonite said:


> I've never had a Sapphire GPU have fan failure and I run custom fan curves on mine that don't allow the fan to turn off and hotspot temps of 90c are bloody hot they don't need to be that high


Hotspot temps of 90 are perfectly normal, and any GPU you've owned previously that only reported a single temperature likely had hotspots there or higher if it was reporting 70+°C. Different coolers perform differently, the differences vary at idle versus under load, and sure, some run lower. Still, an edge-to-hotspot delta of 20°C isn't _good_, but it isn't bad either. It's fine. And 90°C on a piece of silicon rated to operate - continuously, for its entire lifespan, mind you - at 110°C is no problem whatsoever. There is absolutely zero reason to change anything based on this.

As for fan failures: running them closer to 100% is far more likely to wear out the bearings than never letting them stop. Higher fan speed means more friction, means more heat, means more wear, means earlier failure, and this scales non-linearly with fan speed.


----------



## FreedomEclipse (Jul 27, 2022)

Well, I'm back. I've been working on it on and off throughout the day, because PCIe Gen 4 cards don't like being plugged into a Gen 3 slot unless the BIOS is specifically set to Gen 3. Lots of booting up with nothing coming up on the monitor. A lot of stuff happened: I ended up losing my 4.8 GHz OC profile because I took the BIOS battery out and then decided to put a new one in. Couldn't remember my OC settings for the death of me; tried some of the settings I posted on TPU a good few years back and they didn't work. Failed Prime95 in less than 2 minutes.

So I'm back to stock clocks on my 8600K, but the 6700k is alive and well and comes up as the default display, with the iGPU as a secondary output for audio.

Played a little Guild Wars 2 on it. It didn't really punch any higher than my 1080 Ti, but then again that game is very badly optimised. Will be trying out some Warframe later.

Oh, and AMD Adrenalin... what an absolute mess of an app. Back when I had 6970s, the app was simple, easy and straightforward, but now there are so many things to click and scroll through, and 90% of the stuff I have no idea what the hell it's even about. Nvidia's UI was so simple and straightforward.


----------



## Kabouter Plop (Jul 29, 2022)

Anyone else getting GPU driver hangs with their 6000 series during video calls (like WhatsApp) on Windows 11 22H2?


----------



## AleXXX666 (Jul 29, 2022)

mine is Sapphire Pulse RX6600











OC 16% + VRAM 8%.
I'm happy as f**k; this little a** gives a very decent framerate in Hitman 3 at 2K res with the help of FSR. Same deal with GTA V at 2K; it runs pretty well.



kapone32 said:


> Well the weekend passed and I am going to share my experience with the 6800XT. As I stated before I had the card installed in my X399 build with a 2920X that boosts to 4.3 GHZ on the best cores. Using that system I got Firestrike scores of 25814 and the TWWH2 battle benchmark gave me 92.4 FPS. I ran a few others and they were all about 10 to 15% faster than my Vega 64.
> 
> On Saturday my Antec A400 arrived and it took about 45 minutes to change everything (MSI X570 Pro>>MSI X570 Unify)(RX5700>>6800XT)(Nepton 280<<<A400)(Gigabyte 650B>>XPG Gold 750) . When I ran the same benchmarks with my 5600x @ 4.7 ghz. TWWH2 gave me a 154 FPS average but Firestrike was *a ridiculous 35133 almost 10000 points higher. *
> 
> ...


You can pair it with Intel Alder Lake too, lol.


----------



## Kabouter Plop (Jul 29, 2022)

Forgot to mention: there's a new WhatsApp desktop beta app that no longer has drag lag, but I had two black-screen GPU driver crashes in a row with it.


----------



## fullinfusion (Jul 29, 2022)




----------



## Kabouter Plop (Jul 29, 2022)

Seriously, I dunno if it's my memory being faulty or my GPU, but I'm getting fed up with these GPU driver crashes and black screens.


----------



## Valantar (Jul 29, 2022)

fullinfusion said:


> View attachment 256315View attachment 256316View attachment 256317View attachment 256318


It's frustrating how hard it is to get ahold of those reference cards here in Sweden, given how goddamn gorgeous they are. Not that I'm complaining about my Liquid Devil (at all!), but honestly I'd much rather have a reference card and have added my own water block. Such a gorgeous design.


----------



## fullinfusion (Jul 30, 2022)

Valantar said:


> It's frustrating how hard it is to get ahold of those reference cards here in Sweden, given how goddamn gorgeous they are. Not that I'm complaining about my Liquid Devil (at all!), but honestly I'd much rather have a reference card and have added my own water block. Such a gorgeous design.


Yeah man, I had a hunch to check out the AMD site, and lo and behold, I was able to get one. I always go for reference. Why? Dunno, but I just love how clean the card looks and really can't find any fault with it at all. It works nicely, and the performance tab is nice and clean too.


----------



## Athlonite (Jul 30, 2022)

Kabouter Plop said:


> Seriously i dunno if its my memory being faulty or my gpu im getting fed up with these gpu driver crashes and black screens


What driver are you using? Maybe try the version before it and see if that makes a difference. If it doesn't, then chances are it's not a driver/GPU problem but a beta-app problem; report the issue to the app devs.


----------



## Valantar (Jul 30, 2022)

fullinfusion said:


> Yeah man I had a hunch to check out the AMD site and checked and low and behold I was able to get one. I always go for reference, why? dunno but I just love how clean the card looks and really cant find any fault with it at all. It sure works nice and the performance tab is nice and clean too.


Lucky you! I spent a few afternoons furiously F5'ing their site back when I was buying my card, but this was in the midst of the worst of the shortage, so I had no chance whatsoever even knowing the exact time of stock refreshes.


----------



## Kabouter Plop (Jul 30, 2022)

Athlonite said:


> What driver are you using maybe try the one version before it see if that makes a difference if it doesn't then chances are it's not a driver/GPU problem it's a Beta App problem report the issue to the App devs


I've had this with previous drivers as well; even a clean install did exactly the same. It feels like a hardware issue, but I dunno what's causing it.
Edit: I read there may be issues with the last two drivers, especially with YouTube. I don't remember if I had a tab open in the background, but I do use Wallpaper Engine, so perhaps Wallpaper Engine plus any accelerated video is causing the GPU driver hangs.


----------



## AlwaysHope (Jul 30, 2022)

Signing up to this club.


----------



## FreedomEclipse (Jul 30, 2022)

For any people here switching from the green to the red camp: have you made the switch and then discovered that red camp runs games like absolute garbage until the second or third time you load up the same game? It's like it's caching stuff. At the same time it's odd, because I can load up a game of BF4 and that game runs absolutely flawlessly.

I've never had this phenomenon when moving between Nvidia cards. Warframe ran like absolute butthole the first time, as did Guild Wars 2, but the more I played the better my fps got, although I'm still not getting the same smooth performance in that game as I did on my 1080 Ti.

I'm just curious if this is the way AMD does things. First time back with AMD since the ATi 6970 in 2010, so I'm not 100% up to date with how they do things.


----------



## INSTG8R (Jul 30, 2022)

FreedomEclipse said:


> If there are any people switching from green to red camp here -- Have you made the switch then discovered that red camp runs games like absolute garbage till the second or third time you load up the same game? Its like its caching stuff. At the same time its awkward because I can load up a game of BF4 and that game ran absolutely flawlessly.
> 
> Ive never had this phenomenon when moving between Nvidia cards. Warframe ran like absolute butthole the first time as did guildwars 2 but the more i played the better my fps got although im still not getting the same smooth performance in that game as i did my 1080Ti.
> 
> Im just curious if this is the way AMD did things. First time back with AMD since the ATi 6970 in 2010 so im not 100% upto date with the way they do things.


It does in fact have a shader cache, which you can find under the advanced graphics settings. That said, I can't say I've experienced this kind of behaviour. As a beta tester I might reinstall a new driver multiple times a week, most times as a clean install, so that would wipe anything cached by AMD.


----------



## HD64G (Jul 30, 2022)

Kabouter Plop said:


> If had this with previous drivers as well even did clean install exactly did the same it feels like hardware issue but i dunno what is causing it
> edit: read there may be issues with last 2 drivers especially with YouTube i do not remember if i had a tab open in background but i do use wallpaper engine so perhaps wallpaper engine and any accelerated video is causing gpu driver hangs.


My PC started getting unstable a week ago, and after a day or so there were BSODs even while Windows was loading. I just re-seated the RAM DIMMs and everything is perfectly stable now. So the RAM settings, or the DIMMs themselves, can create many stability problems.


----------



## Kabouter Plop (Jul 30, 2022)

HD64G said:


> My PC started getting unstable a week before and after a day or so there were BSOD even when windows was loading. I just re-seated the RAM Dimms and all perfectly stable now. So, the RAM settings or the DIMMs themselves could create many stability problems.



I had one stick not POST before; I found out three pins were dirty, cleaned them with alcohol, it POSTed fine and the issues went away. Now I'm starting to have issues again. Today I've had none, so it seems to happen at random: days at a time stable, followed by a bunch of crashes rapidly in a row.
I'm running HCI MemTest Pro now, plus Heaven, and I never get errors. This is getting extra frustrating, because it makes me want to go back to Nvidia, except their drivers are awful too; the G-Sync experience is the worst, while FreeSync always works.


----------



## Beer4Myself (Jul 30, 2022)

I've had a 6600 Mech since April... the worst thing on this card is the original fan curve: it jumps from 0% to 60% fan speed when the GPU hits 52 degrees, and at 60 degrees it goes instantly to 100%. Also, plastic everywhere... after the two years are up, I'll snap the plastic off and may change to some of my Arctic F12 fans with zip ties.


----------



## Kabouter Plop (Jul 30, 2022)

INSTG8R said:


> It does in fact have a Shader Cache you can find under Advanced Graphics setup. That said I can't say I've experienced this kind of behaviour. I mean as a Beta Tester I might reinstall a new driver multiple times a week and most times a clean install so that would wipe anything cached by AMD



I've had no issues like that ever; my shader-compilation lag went away when I upgraded to a 980 Pro 2 TB NVMe, and now I have two (not in RAID, though).


----------



## Lew Zealand (Jul 30, 2022)

FreedomEclipse said:


> If there are any people switching from green to red camp here -- Have you made the switch then discovered that red camp runs games like absolute garbage till the second or third time you load up the same game? Its like its caching stuff. At the same time its awkward because I can load up a game of BF4 and that game ran absolutely flawlessly.
> 
> Ive never had this phenomenon when moving between Nvidia cards. Warframe ran like absolute butthole the first time as did guildwars 2 but the more i played the better my fps got although im still not getting the same smooth performance in that game as i did my 1080Ti.
> 
> Im just curious if this is the way AMD did things. First time back with AMD since the ATi 6970 in 2010 so im not 100% upto date with the way they do things.



I see this on occasion in a few games but I most obviously can induce this by loading up Horizon Zero Dawn and running the benchmark on my 5600XT.  The first run through is trash with horrendous frame dips especially the first time the camera moves through the town market square.  Second time you run the benchmark it works fine, nice and smooth.  The issue is reduced on my 6600XT and it also seems to have reduced with more recent drivers.  I'm on the 6400 PC with the new drivers, lemme try it now.

In terms of daily use, this effect is the difference I notice most often between Nvidia and AMD cards.  Most games have zero problems with this, especially DX11 ones and I'm trying to remember the others.  Heh, the other big one may have been Minecraft with the old drivers.

*Edit:*  The RX 6400 in a PCIe 3.0 Dell Optiplex 9020 (so PCIe bandwidth limited) worked flawlessly with a mix of Medium settings at 900p (what this card can play at) at 70fps average.  No fps dips.  Actually with 70fps I should bump that to 1080p.  OK I need to try the 5600XT again and that PC doesn't have the new driver yet.  Test a few things there with the old driver including Minecraft, and then I'll try the HZD benchmark again to see if there's been some sort of change there.


----------



## Kabouter Plop (Jul 31, 2022)

Just read this in the notes for the latest drivers:

> Display may flicker black during video playback plus gameplay on some AMD Graphics Products such as the Radeon™ RX 6700 XT.

It doesn't flicker black; it stays black for me. I normally run Wallpaper Engine; I've disabled it for now, although I use scenes, not video backgrounds. But I guess it triggers a 3D load during video calls, which triggers the same issue.


----------



## INSTG8R (Jul 31, 2022)

Kabouter Plop said:


> Just read this in notes latest drivers
> 
> Display may flicker black during video playback plus gameplay on some AMD Graphics Products such as the Radeon™ RX 6700 XT.
> It does not flicker black it stays black for me, i run wallpaper engine normally i have disabled that for now altho i use scenes not video backgrounds but i guess this triggers 3D load during video calls that trigger same issue


It would be great if you could use the Bug Report Tool and file this with the details. Thanks


----------



## kapone32 (Jul 31, 2022)

Kabouter Plop said:


> Just read this in the notes for the latest drivers:
> 
> Display may flicker black during video playback plus gameplay on some AMD Graphics Products such as the Radeon™ RX 6700 XT.
> It doesn't flicker black, it stays black for me. I normally run Wallpaper Engine, though I've disabled it for now; I use scenes rather than video backgrounds anyway, but I guess video calls trigger a 3D load that triggers the same issue.


There seems to be an issue with the driver's logging (among other things). I cannot see my gameplay results in Adrenalin home, and the stress test finishes but then says it failed. I put in a new board and I am not getting the shutdowns, but it is so sporadic that there must be a bug in the gaming scenarios. If I can't play TWWH I am not happy.


----------



## Athlonite (Aug 1, 2022)

kapone32 said:


> the Stress test finishes but then says it failed


Same here, with the exact same OC settings from the previous driver to this one.  The previous driver passed when the test was run, but now it says Failed no matter what settings I use; even stock says Failed. WTF, AMD.


----------



## kapone32 (Aug 1, 2022)

Athlonite said:


> Same here, with the exact same OC settings from the previous driver to this one.  The previous driver passed when the test was run, but now it says Failed no matter what settings I use; even stock says Failed. WTF, AMD.


I hope that we will see a response from AMD on when these issues will be resolved. I am sure AMD Reddit is on fire.


----------



## nolive721 (Aug 1, 2022)

For me and a few others on Steam and forums, there are issues in sim racing games like RF2 or AMS2: CTDs and abnormally dark graphics, to name a few.


I know it's a niche compared to the FPS or sports game user base, but what came after the 22.5.1 drivers was not good for us.


----------



## GoldenX (Aug 1, 2022)

The 6500 XT I got for testing needed an RMA.
GPU prices fell to the ground due to Argentinian economic madness, the usual.
Told the vendor to get me a 6600 instead, I would pay the difference.
Thanks to the lower prices, I paid next to nothing.
So now I own a Sapphire Pulse RX 6600, and I'm loving it.


----------



## AleXXX666 (Aug 1, 2022)

Valantar said:


> It's frustrating how hard it is to get ahold of those reference cards here in Sweden, given how goddamn gorgeous they are. Not that I'm complaining about my Liquid Devil (at all!), but honestly I'd much rather have a reference card and have added my own water block. Such a gorgeous design.


what is the point of having ref design AND replacing HS with waterblock? PCB is PCB lol


----------



## Valantar (Aug 1, 2022)

AleXXX666 said:


> what is the point of having ref design AND replacing HS with waterblock? PCB is PCB lol


It's much easier to sell an air cooled card than a water cooled one, plus it's nice to have the option to switch to air cooling if I should want to move away from water at some point. Not that I'm likely to change my current build for a few years, but I'm definitely considering going all air at some point due to weight and simplicity.


----------



## AleXXX666 (Aug 1, 2022)

GoldenX said:


> The 6500 XT I got for testing needed an RMA.
> GPU prices fell to the ground due to Argentinian economic madness, the usual.
> Told the vendor to get me a 6600 instead, I would pay the difference.
> Thanks to the lower prices, I paid next to nothing.
> So now I own a Sapphire Pulse RX 6600, and I'm loving it.


I had a 3060 Ti during the crazy pricing times just because I was tired of not gaming lol.
Sold it and got a 6600. Happy as a baby lol.
It's enough for me; the 3060 Ti was overkill for my games (Hitman 3 and GTA V at most, NFS a little).



Valantar said:


> It's much easier to sell an air cooled card than a water cooled one, plus it's nice to have the option to switch to air cooling if I should want to move away from water at some point. Not that I'm likely to change my current build for a few years, but I'm definitely considering going all air at some point due to weight and simplicity.


Yeah, water is nice, but a custom loop is such a mess I don't even bother lol.


----------



## GoldenX (Aug 1, 2022)

AleXXX666 said:


> I had a 3060 Ti during the crazy pricing times just because I was tired of not gaming lol.
> Sold it and got a 6600. Happy as a baby lol.
> It's enough for me; the 3060 Ti was overkill for my games (Hitman 3 and GTA V at most, NFS a little).


The lower power consumption alone makes up for it, the thing barely gets hot, and is small AF.


----------



## Athlonite (Aug 1, 2022)

AleXXX666 said:


> what is the point of having ref design AND replacing HS with waterblock? PCB is PCB lol


No, unfortunately that isn't always the case. Manufacturers are free to design their own PCBs; most only use the reference design in the initial runs until they've got their own design up and running. So buying a reference-style air or water cooling kit to use on a non-reference card may not work; it all depends on how far the manufacturer departed from the reference design with their own.


----------



## Valantar (Aug 1, 2022)

Athlonite said:


> No, unfortunately that isn't always the case. Manufacturers are free to design their own PCBs; most only use the reference design in the initial runs until they've got their own design up and running. So buying a reference-style air or water cooling kit to use on a non-reference card may not work; it all depends on how far the manufacturer departed from the reference design with their own.


True, though hardly an issue with my Liquid Devil Ultimate as it came with a water block and probably has a higher end PCB than the reference card. It's just a bit wasted on me given that I have no interest in OC'ing, plus I like the less ostentatious style of the reference card + some of the waterblocks for it better than the LDU, as well as their slightly smaller size. Still, in my case I got what was available at the time, and I'm really not complaining at all.


----------



## Kabouter Plop (Aug 1, 2022)

And I just crashed again with a black screen. WTF is wrong with these drivers? No issues with gaming other than randomly getting a black screen at times, either during a video call or during a game. I just reseated my GPU and cleaned the contacts to be sure; there's no way it's my PSU, and I doubt it's my memory again, I already solved that, and it's not even running close to as hot as it would need to get errors. What are the odds the memory is faulty? Last time it would not take one stick, and it ended up that one stick had 3 dirty contacts; I cleaned it and it worked again, and the crashes stopped. Now it's giving me black screens. Why am I so unlucky? I couldn't get a single good G9, they all had dead pixels, and one Valve Index with a bright pixel. Now my computer feels like it's dying, especially the GPU, and I'm fed up. How do I troubleshoot this if I do not get any memory errors?


----------



## Valantar (Aug 1, 2022)

Kabouter Plop said:


> Why am I so unlucky? I couldn't get a single good G9, they all had dead pixels


They pretty much literally _all_ did, so that's not bad luck, it's just that monitor.

As for your black screen issues: does AMD Software give you driver crash error notifications afterwards? What does the Event Log tell you about what is going on with the system at the time (both System and Application logs)? Does only the screen go black, or does the system actually shut down (e.g. does audio keep playing)?


----------



## Kabouter Plop (Aug 1, 2022)

It does if I wait, or if I do Win+Ctrl+Shift+B, and I can file a bug report, but there's no dmp file or anything. When I restart a WhatsApp video call it usually black screens within 10 minutes. I've now installed 21.12.1, last year's final driver: 25 minutes and no black screen so far, whether on the desktop or doing a video call while playing World of Warcraft. Although these black screens can vanish for days and then reappear and consistently show up again, as if I have a leak or something, one drop causing a short until it dries up the next day, although I'm very confident I have no leaks.
I really like noise suppression since I can apply it to others as well; I have a Cloud Flight S, so I can leave it on all the time without it doing weird things to my own audio like music. I make a lot of WhatsApp video calls with people who type a lot on their keyboards, which it can now filter out, so I'm hoping the next driver addresses these black screens. I had memory issues a couple of months ago, but I fixed those already; I set my RAM speed from 3600 to 3200 and even turned off PBO, and it still black screened within 10 minutes.
20 minutes gaming and 20 minutes of WhatsApp video calls while not gaming on 21.12.1: 0 black screens so far.
Edit: 1 hour and 13 minutes, no issues so far on 21.12.1.
2 hours and 12 minutes in total with no black screen. Guess I'll be testing for at least a week or maybe two, unless AMD pushes 22.7.2 and fixes stability.


----------



## AlwaysHope (Aug 2, 2022)

Impressed by the ridiculously low power consumption of an Asus RX 6800 XT TUF edition on the 22.6.1 (WHQL) drivers playing _Wolfenstein: The New Order_ @ 1440p Ultra; too bad the game is capped at 60fps!
8 hrs of playing so far, it varies between 52W and 71W according to the HWiNFO GPU ASIC power reading's maximum value column, plus the benefit of 0dB fan noise!


----------



## Athlonite (Aug 2, 2022)

Surely that cap is only artificial though; can you not edit some .ini file to remove it?
A quick DuckDuckGo search finds this tidbit of info:

All you have to do is bring up the command console (while in-game) and type “toggle com_synctotime”. You’ll have to do this every time you launch the game, or you can simply add “+toggle com_synctotime” as a launch option and you won’t have to bother with this command ever again.
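For the Steam version, the persistent route is the launch option mentioned above: right-click the game → Properties → Launch Options (assuming the quoted `com_synctotime` command is accurate for this game), e.g.:

```
+toggle com_synctotime
```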


----------



## Kabouter Plop (Aug 2, 2022)

You can uncap the framerate of Wolfenstein: The New Order; there are mods.


----------



## Chomiq (Aug 2, 2022)

AlwaysHope said:


> Impressed by the ridiculously low power consumption of an Asus RX 6800 XT TUF edition on the 22.6.1 (WHQL) drivers playing _Wolfenstein: The New Order_ @ 1440p Ultra; too bad the game is capped at 60fps!
> 8 hrs of playing so far, it varies between 52W and 71W according to the HWiNFO GPU ASIC power reading's maximum value column, plus the benefit of 0dB fan noise!
> View attachment 256695
> 
> View attachment 256696


Seriously? The game has one of the best performing engines of this decade and they opted for a 60 fps cap? That's nonsense.


----------



## AVATARAT (Aug 2, 2022)

Kabouter Plop said:


> It does if I wait, or if I do Win+Ctrl+Shift+B, and I can file a bug report, but there's no dmp file or anything. When I restart a WhatsApp video call it usually black screens within 10 minutes. I've now installed 21.12.1, last year's final driver: 25 minutes and no black screen so far, whether on the desktop or doing a video call while playing World of Warcraft. Although these black screens can vanish for days and then reappear and consistently show up again, as if I have a leak or something, one drop causing a short until it dries up the next day, although I'm very confident I have no leaks.
> I really like noise suppression since I can apply it to others as well; I have a Cloud Flight S, so I can leave it on all the time without it doing weird things to my own audio like music. I make a lot of WhatsApp video calls with people who type a lot on their keyboards, which it can now filter out, so I'm hoping the next driver addresses these black screens. I had memory issues a couple of months ago, but I fixed those already; I set my RAM speed from 3600 to 3200 and even turned off PBO, and it still black screened within 10 minutes.
> 20 minutes gaming and 20 minutes of WhatsApp video calls while not gaming on 21.12.1: 0 black screens so far.
> Edit: 1 hour and 13 minutes, no issues so far on 21.12.1.
> 2 hours and 12 minutes in total with no black screen. Guess I'll be testing for at least a week or maybe two, unless AMD pushes 22.7.2 and fixes stability.


I suppose that with the latest drivers AMD touched the power control, which gives us a bit more performance but also a higher power draw. So yeah, you can get a restart or black screen, and you need a bit more voltage or to reduce the clock a little.
About the black screen: when I got it, I would wait 10-20 seconds and then use Ctrl+Alt+Del; most times this worked and woke up the display.


----------



## Kabouter Plop (Aug 2, 2022)

The 6900 XT has no way to increase voltage, only undervolt, and I only use the Rage profile for 24/7 use.
You can restart the GPU driver with *Win + Ctrl + Shift + B*.


----------



## AlwaysHope (Aug 2, 2022)

Kabouter Plop said:


> You can uncap the framerate of Wolfenstein: The New Order; there are mods.


& keep the physics in order? I've used crap like that before in FO4 & it did not go well.


Chomiq said:


> Seriously? The game has one of the best performing engines of this decade and they opted for a 60 fps cap? That's nonsense.


It is what it is.

However, the game is poorly optimised from what I've seen in vanilla. There are parts of the game where the fps drop below 50 on my system & the sound is distorted in some parts as well.


----------



## AVATARAT (Aug 2, 2022)

Kabouter Plop said:


> The 6900 XT has no way to increase voltage, only undervolt, and I only use the Rage profile for 24/7 use.
> You can restart the GPU driver with *Win + Ctrl + Shift + B*.


Rage is not power efficient. I use custom settings in Driver menu with undervolt and overclock.
Yes, but this does not always help to resolve the black screen.


----------



## Kabouter Plop (Aug 2, 2022)

AVATARAT said:


> Yes, but this does not always help to resolve the black screen.



No, those will keep coming, but for me it stays black for like 20-30 seconds, so doing that speeds up the process of the driver being restarted.


----------



## AVATARAT (Aug 2, 2022)

Kabouter Plop said:


> No, those will keep coming, but for me it stays black for like 20-30 seconds, so doing that speeds up the process of the driver being restarted.


It depends on how hard it crashed.


----------



## Kabouter Plop (Aug 2, 2022)

AVATARAT said:


> It depends on how hard it crashed.



I've had it happen frequently when doing video calls in WhatsApp. There's the standard WhatsApp for desktop and the beta app; I had it during both, but the beta has no drag lag anymore in Windows 11 22H2. I also had it in 21H2 with the same 22.7.1 driver.


----------



## Valantar (Aug 2, 2022)

Kabouter Plop said:


> The 6900 XT has no way to increase voltage, only undervolt, and I only use the Rage profile for 24/7 use.
> You can restart the GPU driver with *Win + Ctrl + Shift + B*.


... so maybe try without applying an auto OC profile? Not that the clock differences are likely to be significant from that, but it does boost the power limit a bit. It might be that the latest driver has tweaked something about how the card handles clock and voltage adjustments or similar, and that these specific tweaks now render the Rage profile unstable on your hardware. There's no guarantee for an OC to be stable after all, and _definitely_ no guarantee for an auto OC to be stable. Go back to balanced, or try some manual tweaking? The Rage profile is kind of stupid anyhow; you'll get higher boost clocks and more efficient operation by undervolting.


----------



## Kabouter Plop (Aug 2, 2022)

The Rage profile is not an OC, I believe; it's just a 3% increase, and those profiles should always be rock stable. Anyway, it's more and more starting to look like the most recent drivers are the least stable I've ever had; I'm pretty sure I've never seen a driver cause driver crashes like this before, maybe once, I don't remember which, but it was another recent one I think. I guess I'll find out soon enough. Worst case scenario, something is wrong with my system and I won't find out the cause anytime soon. Lucky me though, I have a spare PSU, an EVGA 1300 G2 in my TrueNAS server, so in the worst case I can test that too, but I'm not in the mood to mount that PSU in the case and redo all the cabling.


----------



## Valantar (Aug 2, 2022)

Kabouter Plop said:


> The Rage profile is not an OC, I believe; it's just a 3% increase, and those profiles should always be rock stable


There is no such thing as a stable OC profile outside of OC bioses provided by the board partner. AMD does not have the resources or time to test every partner GPU at every profile with all kinds of different surrounding components, test conditions and workloads. All OC is a risk, including an AMD-provided profile. Rid yourself of that belief, as it is just plain untrue.

Have you tested at stock settings? Have you tested with a manual OC to match the rage profile alongside a slight undervolt?


----------



## Kabouter Plop (Aug 2, 2022)

I don't have a third-party GPU, I have an AMD GPU, quite literally; it's basically stock with a water block slapped onto it. I'm not overclocking, undervolting, or anything, except picking the least silent option, aka the Rage profile. Since I'm watercooled it's not overheating, or anywhere close to overheating either.


----------



## GreiverBlade (Aug 2, 2022)

FreedomEclipse said:


> If there are any people switching from green to red camp here -- Have you made the switch then discovered that red camp runs games like absolute garbage till the second or third time you load up the same game? Its like its caching stuff. At the same time its awkward because I can load up a game of BF4 and that game ran absolutely flawlessly.
> 
> Ive never had this phenomenon when moving between Nvidia cards. Warframe ran like absolute butthole the first time as did guildwars 2 but the more i played the better my fps got although im still not getting the same smooth performance in that game as i did my 1080Ti.
> 
> Im just curious if this is the way AMD did things. First time back with AMD since the ATi 6970 in 2010 so im not 100% upto date with the way they do things.


Nope... although I had to DDU twice and install the latest drivers from Adrenalin (and not from TPU), but since then everything has been flawless. Anything I play runs at 1620p60 max settings (no AA, because who needs AA @3K).

Tbf, I had way more issues with Nvidia drivers than AMD.




AleXXX666 said:


> what is the point of having ref design AND replacing HS with waterblock? PCB is PCB lol


Well, most water block manufacturers only do one fitting, for the ref design only (I had that with my R9 290 back in the day),
although it's not uncommon to see custom cards with a ref board design nonetheless... (or a manufacturer like EK will list some special cards, usually the most expensive ones, like a "KingPong", yeah I know that's not the correct spelling).


----------



## AVATARAT (Aug 2, 2022)

Kabouter Plop said:


> I don't have a third-party GPU, I have an AMD GPU, quite literally; it's basically stock with a water block slapped onto it. I'm not overclocking, undervolting, or anything, except picking the least silent option, aka the Rage profile. Since I'm watercooled it's not overheating, or anywhere close to overheating either.


Just to show you temps/watts/fps:
my game profile - undervolt + overclock
one test (I just created it) - undervolted
and the Rage profile


----------



## Kabouter Plop (Aug 3, 2022)

Memory clock is 2000; it's missing in HWiNFO because of a Windows 11 22H2 install bug, but it still shows up in the AMD control panel.
Like I already mentioned, it's the 22.7.1 driver having issues. They probably changed the power saving options for certain setups, to fix idle memory clocks or power usage, and broke something, since it happens during video calls at the desktop with nothing else running.
I've also read reports now that some users on 22.7.1 can't even pass AMD's own stress test.


----------



## Kabouter Plop (Aug 4, 2022)

Decided to test 22.7.1 one more time after having had no black screens; I've learned to trigger it even faster, in just 2 minutes, by doing a call with myself.


----------



## puma99dk| (Aug 6, 2022)

Kabouter Plop said:


> Decided to test 22.7.1 one more time after having had no black screens; I've learned to trigger it even faster, in just 2 minutes, by doing a call with myself.



I am running 22.7.1 with no issues yet, using DP on my 4K@144Hz monitor.

I got a PowerColor Radeon RX 6800 XT Red Devil; does anyone have good undervolting settings for this card?

Right now I am doing 2424/2000MHz @ 1.050V; if I go lower the card gets unstable, so maybe this is the max I can undervolt at default clocks.
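For a rough sense of what an undervolt like that buys you: dynamic power scales approximately with frequency times voltage squared. A quick sketch under that rule of thumb; it ignores leakage and real boost behaviour, and the 1.150 V stock figure is an assumption, so treat the result as a ballpark only:

```python
def power_ratio(f_new, v_new, f_old, v_old):
    """Dynamic-power ratio under the P ~ f * V^2 rule of thumb."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example: same 2424 MHz clock, an assumed 1.150 V stock vs the 1.050 V
# undervolt mentioned above -> ratio ~0.83, i.e. roughly 17% less
# dynamic power.
print(round(power_ratio(2424, 1.050, 2424, 1.150), 2))
```

Real cards also shuffle clocks around as voltage changes, so HWiNFO's ASIC power reading is the number that actually settles the question.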


----------



## GreiverBlade (Aug 6, 2022)

puma99dk| said:


> I am running 22.7.1 with no issues yet, using DP on my 4K@144Hz monitor.
> 
> I got a PowerColor Radeon RX 6800 XT Red Devil; does anyone have good undervolting settings for this card?
> 
> Right now I am doing 2424/2000MHz @ 1.050V; if I go lower the card gets unstable, so maybe this is the max I can undervolt at default clocks.


My RX 6700 XT does about the same (a bit higher on the core, though, but that's normal); I guess 1050mV is indeed the lowest stable undervolt.
1200mV is the default for me.

Ah, I did not notice the 22.7.1 update.


----------



## Kabouter Plop (Aug 8, 2022)

Updated my watercooling with a temp sensor to see how hot the water gets. I almost thought it didn't work at all; it would not go higher than 31°C. So I set my fans at 700 RPM, and it gets as hot as 37°C at 700 RPM.

I peaked at 69°C hotspot, 53°C GPU temp.
The average went down to about 63°C hotspot, 49°C GPU temp after ramping the fans up to 100%, with the water temp down to 33°C.
Standing here at 3840x1600 with ray tracing on high makes my GPU get way hotter than any other game; I usually top out around 59-60°C, hotspot 62°C at best.
My GPU boosts to about 2420MHz here with the Rage profile, which sets the 6900 XT power limit to 270.3 watts, normally 255 stock.

For those watercooling: what water temp do you target for your fan curve? Just wondering.
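Since the question of water-temp-based fan curves came up: most fan controller software just does linear interpolation between coolant-temp/RPM points. A minimal sketch of that logic; the curve points here are made up for illustration, not a tuned recommendation:

```python
def fan_rpm(water_c, curve=((30.0, 700), (35.0, 1000), (40.0, 1500))):
    """Map coolant temperature (degC) to fan RPM by linear interpolation.

    `curve` is a sorted tuple of (water degC, RPM) points; the values
    above are illustrative only.
    """
    if water_c <= curve[0][0]:
        return curve[0][1]          # below the curve: quietest speed
    if water_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full configured speed
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if water_c <= t1:
            # interpolate between the two surrounding points
            return r0 + (r1 - r0) * (water_c - t0) / (t1 - t0)
```

With these points the fans sit at 700 RPM until the loop passes 30°C and ramp toward full speed by 40°C; in practice you would put the knee just above your loop's idle water temperature.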


----------



## Valantar (Aug 9, 2022)

Kabouter Plop said:


> Updated my watercooling with a temp sensor to see how hot the water gets. I almost thought it didn't work at all; it would not go higher than 31°C. So I set my fans at 700 RPM, and it gets as hot as 37°C at 700 RPM.
> 
> I peaked at 69°C hotspot, 53°C GPU temp.
> The average went down to about 63°C hotspot, 49°C GPU temp after ramping the fans up to 100%, with the water temp down to 33°C.
> ...


Given that I'm cooling a 6900XTX and 5800X on a single 280mm rad, I don't have a set water temp target - that's just not realistic. At full power in warm ambients (~30°C) I've never seen it go higher than ~43°C though, and in most gaming scenarios it sits much lower than that (especially with my regular underclock preset). My GPU never exceeds 65°C edge temp - I can't recall the peak hotspot temp I've seen but it's definitely sub-80.


----------



## Kabouter Plop (Aug 9, 2022)

The hotspot is visible as the junction temp in the AMD control panel, or as the hotspot temp in HWiNFO. It's usually about 15-22°C higher; the more you push, the bigger the difference potentially is. At stock I would expect a 14-15°C difference, especially with a good mount.
I have a triple-radiator setup: 2x 45mm and 1x slim, I forgot the thickness, probably about 25-30mm.


----------



## Atheist Jr (Aug 9, 2022)

Just submitted the bios for my 6950: https://www.techpowerup.com/vgabios/248132/248132


----------



## Valantar (Aug 9, 2022)

Kabouter Plop said:


> The hotspot is visible as the junction temp in the AMD control panel, or as the hotspot temp in HWiNFO. It's usually about 15-22°C higher; the more you push, the bigger the difference potentially is. At stock I would expect a 14-15°C difference, especially with a good mount.
> I have a triple-radiator setup: 2x 45mm and 1x slim, I forgot the thickness, probably about 25-30mm.


... I know where to find the hotspot temp, I was saying that unlike edge temps I can't remember quite where the hotspot temp typically sits. From what I can remember though, my deltas tend to be closer to 10°C than 20. Edge-hotspot deltas vary quite a lot between coolers and installations.


----------



## AVATARAT (Aug 10, 2022)

> The 6000 series GPU cards' Maximum Operating Temperature for Hot Spots (Junction Temperature) is 110°C.
> There are several Thermal Sensors throughout the GPU card. If any reaches 110°C it will automatically start to slow down or throttle to maintain the temperature at 110°C or lower.
> There are two Temperature Readings for the GPU card. One is for the general temperature of the GPU card while the other is the Hot Spot (Junction Temp) temperature.
> So as long as the temperature never reaches 110°C you are fine. But it is best to try and keep the hot spot as low as possible.


So I can't see a problem with using it at an 85-90°C junction temp.
Link.


----------



## AleXXX666 (Aug 10, 2022)

GreiverBlade said:


> Nope... although I had to DDU twice and install the latest drivers from Adrenalin (and not from TPU), but since then everything has been flawless. Anything I play runs at 1620p60 max settings (no AA, because who needs AA @3K).
> 
> Tbf, I had way more issues with Nvidia drivers than AMD.
> 
> ...


Oh, then that's the point. Yeah, buying overpriced sh!t like "KingKong" (Kingpin) or "sh!tbauer" (der8auer) editions only makes sense if you're their fanatic. I am not a fanatic, so I can't justify paying 3x the price of the alternatives lol.


----------



## Valantar (Aug 10, 2022)

AleXXX666 said:


> Oh, then that's the point. Yeah, buying overpriced sh!t like "KingKong" (Kingpin) or "sh!tbauer" (der8auer) editions only makes sense if you're their fanatic. I am not a fanatic, so I can't justify paying 3x the price of the alternatives lol.


Tbh, a lot of der8auer's stuff is quite reasonably priced. The Delid Die Mate is, what, €25? I wouldn't say €30 for a precision-machined piece of aluminium like that LGA1700 contact frame is "overpriced" either. Can it be made more cheaply in lower-cost countries, by cutting corners in production (including lower-precision machining), etc.? Obviously. And that is ultimately fine; it's not like the TR version is a ripoff, as it's plenty clear several people had the same idea at once. But does the existence of a €5-on-AliExpress version make the €30 version overpriced? No.

There's a ton of expensive engineering that goes into things like Kingpin edition GPUs too. The only issue is that this engineering is completely useless for most end users, and adds zero real value. But for anyone doing extreme OC it might have some value. Different uses have different needs. And, of course, if you're one of those people who buys a Kingpin edition gpu because it has a cool name on it, well, then you're the sucker that's paying for all of those sponsored extreme OCers.


----------



## AleXXX666 (Aug 10, 2022)

Valantar said:


> Tbh, a lot of der8auer's stuff is quite reasonably priced. The delid die mate is, what €25? I wouldn't say €30 for a precision machined piece of aluminium like this frame is "overpriced" either. Can it be made more cheaply in lower cost countries, by cutting corners in production (including lower precision machining), etc? Obviously. And that is ultimately fine - it's not like the TR version is a ripoff, as it's plenty clear several people had the same idea at once. But does the existence of a €5-on-Aliexpress version make the €30 version overpriced? No.


I mean "highly binned" CPUs, this kind of sh!t... I dunno.
You actually pay a very large premium just for a portion of CPUs having been tested for OC. If you OC for OC's sake, then it's okay; but if you OC for performance, then you could purchase the next tier of CPU for the same price lol.


----------



## Valantar (Aug 10, 2022)

AleXXX666 said:


> I mean "highly binned" CPUs, this kind of sh!t... I dunno.
> You actually pay a very large premium just for a portion of CPUs having been tested for OC. If you OC for OC's sake, then it's okay; but if you OC for performance, then you could purchase the next tier of CPU for the same price lol.


Oh, of course, binned CPUs are always stupidly priced. It's a luxury product aimed at people wanting to OC but not wanting to spend the time needed to do so. The value proposition is essentially zero, or along the lines of "for the people with lots of money but little time". Just another way of flexing your wealth.

And sorry for the ninja edits btw


----------



## Kabouter Plop (Aug 11, 2022)

Well, looks like the black screen freezes were video related; let's hope they fix it in the next driver.
New driver: no black screens so far. I noticed different behaviour; GPU power fluctuates more during WhatsApp calls while it's steady outside of a call, so I guess they made some changes to focus on stability.
Another unstable driver; it just took a little longer this time. What's even more of an insult is that this is the game-ready driver for Spider-Man Remastered; there's no way this should be unstable.


----------



## jext (Aug 14, 2022)

I think I got a pretty good 6800 sample - open box on amazon too 









				




I am really tempted to start using MPT because there seems to be more OC headroom; everything is maxed in Wattman with no artifacts.  Max junction temperature was 80°C with a 70% max fan.


----------



## Athlonite (Aug 14, 2022)

@jext  so I see you overclocked the crap outta it but did not use a custom fan curve or turn off the stupid Zero RPM crapolla.

Turn on Advanced Control in the fan section, turn off Zero RPM, and make a custom fan profile.

You also seem to be lucky with the power limit going to 15%; mine's only 10%. What make of RX 6800 do you have?


----------



## jext (Aug 14, 2022)

Athlonite said:


> @jext  so I see you overclocked the crap outta it but did not use a custom fan curve or turn off the stupid Zero RPM crapolla.
> 
> Turn on Advanced Control in the fan section, turn off Zero RPM, and make a custom fan profile.
> 
> You also seem to be lucky with the power limit going to 15%; mine's only 10%. What make of RX 6800 do you have?


Thanks, I'll be doing that next because the fans do get pretty loud at 70%. The model I have is the MSI Gaming X Trio.


----------



## kambei.7s (Aug 19, 2022)

I got the RX 6600 Eagle card from Gigabyte; big and cheaply made, I can say: plastic "backplate", and the memory's thermal pads sit on the same heat pipes as the graphics chip. The card is quiet below 60% fan speed, and it was cheaper than the 3060 and 3050 in my country at the moment I bought it. Temperatures are okay on that card: under Witcher 3 on ultra without vsync I have 75°C on the hotspot and 68°C on the chip; BFV, 78°C hotspot and 70°C chip, with the Zalman Z9 U3 case coolers on minimum. My previous video cards had better overall quality: a GTX 1060 Gaming v2 (after years the cooler started making noise if I manually set it above 55%), a Radeon 7950 Windforce 3 (had an OC bug, BIOS flashed to undervolt and lower the core clock; both still work fine), and an RX 570 from Sapphire, almost perfect except it needed manual cooler control.
That card has stutters every 5-10 minutes (frametime jumps from 6ms to 150ms per the AMD driver stats; Afterburner shows 50ms spikes) in some games. A good example is BFV: at 150 FPS and 6ms I can't play comfortably because of it. My GTX 1060 6GB provided a better experience even with a slower CPU at the same high settings (R5 1600AE vs R5 3600 without boost). As I understand it, this is a problem for the whole 6xxx series. I turned off SAM; this didn't help much. I already regret the buy; better to get some RTX 3050, slow, yes, but without those AMD driver lags. Sorry for my bad English


----------



## Valantar (Aug 19, 2022)

kambei.7s said:


> That card has stutters every 5-10 minutes (frametime jumps from 6ms to 150ms per the AMD driver stats; Afterburner shows 50ms spikes) in some games. A good example is BFV: at 150 FPS and 6ms I can't play comfortably because of it. My GTX 1060 6GB provided a better experience even with a slower CPU at the same high settings (R5 1600AE vs R5 3600 without boost). As I understand it, this is a problem for the whole 6xxx series. I turned off SAM; this didn't help much. I already regret the buy; better to get some RTX 3050, slow, yes, but without those AMD driver lags. Sorry for my bad English


Sounds like the type of bug that ought to be reported to both AMD and whoever is developing those games. Can't imagine an issue like that being too hard to fix, either on a driver or game code level.


----------



## kambei.7s (Aug 19, 2022)

Sounds easy, yes. But it will take years before AMD fixes that problem, as always. Some games aren't supported well at launch, and especially not years after release. I think the problem is in clock spikes: the card sometimes sleeps and drops clocks too low.


Valantar said:


> Sounds like the type of bug that ought to be reported to both AMD and whoever is developing those games. Can't imagine an issue like that being too hard to fix, either on a driver or game code level.


----------



## Kabouter Plop (Aug 19, 2022)

If the GPU is underutilized, just set everything as high as possible, render scale included if possible. I had that problem a lot when I still had a 1080p screen, because I went through hell trying to get a proper Samsung G9 (it always had dead pixels), then tried for a Neo that came DOA with pixel warranty, followed by the replacement never reaching me because of a bright pixel dead center, so I canceled because I got fed up, and got a good LG ultrawide on the first try.
Just try playing Golf With Your Friends on a 6900 XT at 1080p; it won't even trigger proper 3D clocks.


----------



## Valantar (Aug 19, 2022)

Kabouter Plop said:


> If the GPU is underutilized, just set everything as high as possible, render scale included if possible. I had that problem a lot when I still had a 1080p screen, because I went through hell trying to get a proper Samsung G9 (it always had dead pixels), then tried for a Neo that came DOA with pixel warranty, followed by the replacement never reaching me because of a bright pixel dead center, so I canceled because I got fed up, and got a good LG ultrawide on the first try.
> Just try playing Golf With Your Friends on a 6900 XT at 1080p; it won't even trigger proper 3D clocks.


I've seen the same thing with my 6900 XT at 1440p in lightweight titles. It has improved quite a bit since I first got it though.



kambei.7s said:


> Sounds easy, yes, but it will take years for AMD to fix the problem, as always. Some games aren't supported well at launch, and especially not years after release. I think the problem is clock spikes: the GPU goes to sleep sometimes and drops clocks too low.


That's a possible explanation, absolutely. Fixes always taking years is BS though. AMD, like all large hardware vendors, has weird issues that take time to fix, but that doesn't mean all fixes take that kind of time. And, of course, when the fault is game-specific, it might not be something AMD can fix; it might require the developer to intervene, or the two to work together - the latter of course making any fix far harder to create and roll out. Either way, report your bugs to both the developers and AMD, and if nothing happens, keep pestering them. That's the only way to do this.


----------



## Kabouter Plop (Aug 19, 2022)

Question, especially for those using 22.8.1: do you leave hardware acceleration on everywhere, turn it off everywhere, or just in a few apps?
Speaking of 22.8.1, this is how AMD's own stress test behaves



Normally it maxes out at 270W power consumption with much lower clocks, around 200 MHz lower. I am not overclocking at all whatsoever; it does this on the default tuning profile as well as the Rage tuning profile.
Just tested with all hardware acceleration off and it still black screened during WhatsApp calls. For some reason, if I do a call with myself I never get a black screen, as if a specific codec from a different type of webcam is causing the issue.


----------



## Ajp2022 (Aug 20, 2022)

Hi,

Just got a Sapphire Pulse RX 6600 replacing an Aorus RX 580. A few issues I've had that others may be able to help with (thanks!).

Spec: 

B450 DS3H mobo
Ryzen 2600 CPU
Crucial Ballistix 32GB 3200 (running at 2966 due to instability at stock settings)
Acer VG240Y monitor
EVGA 600W PSU
Adrenalin 22.5.1
Win 10 Pro

I've got the new GPU installed and running. However, when booting up it shows "input not supported" for a few seconds, then it loads the Windows login screen. Fast boot is disabled in BIOS and in Windows power settings. I don't see the BIOS screen while it is booting.

If I select Shift and Restart in Windows to get into the BIOS, it restarts but I get the same message, and I have to restart the PC as the BIOS display doesn't show.

It is as if it isn't initially detecting the monitor, so I'm not seeing the display where I'd access the BIOS options. The only way I can get the BIOS showing is taking out the CMOS battery; it displays on the first boot after that, but after another boot cycle the problem is back.

I heard that using an HDMI 2.1 cable could help?

Thanks if you have any insights on this.


----------



## Athlonite (Aug 20, 2022)

That can happen if you leave the connection setting on your monitor set to Auto. Try setting it to HDMI 1 or DP (DisplayPort), depending on which you use; preferably you should be using the DP port on both the GPU and the monitor.

Go into your monitor's OSD, find the input selection menu, and manually set it to whichever port you're using.


----------



## Ajp2022 (Aug 20, 2022)

Hi,

Thanks for your response. I don't have DP on my monitor, so I'm using HDMI. I do have an HDMI-to-DP adapter, but it's not in use. I've changed the input to HDMI 1 and turned off auto source selection. Sadly the issue persists. I didn't have this with the RX 580, and when I switch back to that GPU it goes away.

It is as if the RX 6600 only kicks in once the OS has reached the login screen.


----------



## kambei.7s (Aug 20, 2022)

Valantar said:


> I've seen the same thing with my 6900 XT at 1440p in lightweight titles. It has improved quite a bit since I first got it though.
> 
> 
> That's a possible explanation, absolutely. Fixes always taking years is BS though. AMD, like all large hardware vendors, have weird issues that take time to fix, but that doesn't mean all fixes take that kind of time. And, of course, when the fault is game specific, it might not be something AMD can fix, but it might require the developer to intervene, or it might require the two to work together - the latter of course making any fix far harder to make and implement. Either way, report your bugs to both developers and AMD, and if nothing happens, keep pestering them. That's the only way to do this.


As an end user I bought the card and it's big money for me; I want to play games on it out of the box, and the card is more than enough for FHD in optimized old titles like BFV. Why must I wait for AMD to fix this? Good thing I have another build to play on in comfort. For example, they never fixed the Skyrim/BF3 low-GPU-usage problem on the Radeon HD 6xxx series; it started with driver version 14.4 and was never fixed.

I locked the clocks to min/max 2044/2144 MHz and it reduced the lag, but it hasn't disappeared completely; it's just not so noticeable now.


----------



## Valantar (Aug 20, 2022)

kambei.7s said:


> As an end user I bought the card and it's big money for me; I want to play games on it out of the box, and the card is more than enough for FHD in optimized old titles like BFV. Why must I wait for AMD to fix this? Good thing I have another build to play on in comfort. For example, they never fixed the Skyrim/BF3 low-GPU-usage problem on the Radeon HD 6xxx series; it started with driver version 14.4 and was never fixed.


I completely understand where you're coming from, and I entirely agree that this should be fixed - the sooner, the better. The problem is, bugs happen. And they can be weird and elusive and hard to pin down - or fixing them might be outside of the control of GPU makers. And that _sucks_. But all anyone outside of the GPU makers or game devs (or API vendors, to some extent) can really do is report bugs, keep reporting them if they go unfixed, and wait it out. Or choose one of the two nuclear options: try another GPU, or play another game. That's obviously not an ideal solution to anything, but those are the only things that can be done. Luckily, AMD has been putting a lot more money into its GPU driver development in recent years since they're no longer on the verge of bankruptcy, and it's been showing for quite a while through far fewer bugs, far faster bug fix rollouts, far more frequent driver releases, and more. Which also makes drawing on decade-old comparisons rather misleading, given how vastly different these things were then vs. now. I'm obviously not saying AMD's GPU drivers are problem-free - no drivers are. You just seem to have the bad luck of getting a particular bug in a game you play a lot, which obviously sucks for you, and should indeed be addressed. The question is how reproducible the bug is, what is causing it, and whether it's within AMD's power to fix. One would hope a game as big as BFV would have been addressed long ago, but apparently not, which would seem to suggest that it's either very difficult to reproduce and diagnose, or needs the intervention of someone outside of AMD. None of that helps you, of course, but it serves to explain why these things are sometimes not as simple as they seem.


----------



## AlwaysHope (Aug 22, 2022)

Does this series of cards do well in DX9 games?


----------



## AVATARAT (Aug 22, 2022)

kambei.7s said:


> As an end user I bought the card and it's big money for me; I want to play games on it out of the box, and the card is more than enough for FHD in optimized old titles like BFV. Why must I wait for AMD to fix this? Good thing I have another build to play on in comfort. For example, they never fixed the Skyrim/BF3 low-GPU-usage problem on the Radeon HD 6xxx series; it started with driver version 14.4 and was never fixed.
> 
> I locked the clocks to min/max 2044/2144 MHz and it reduced the lag, but it hasn't disappeared completely; it's just not so noticeable now.


You can try setting the in-game max FPS a bit lower, or you can do it through Adrenalin's *Frame Rate Target Control*.
Both of these can fix micro stutters in some titles.
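For what it's worth, the idea behind any frame cap (an in-game limiter or FRTC) is to even out frame pacing by sleeping away the unused part of each frame's time budget instead of racing ahead. A minimal, generic sketch of that loop (not AMD's implementation, just the concept):

```python
import time

def run_capped(frames: int, target_fps: float) -> float:
    """Render-loop sketch: sleep away the rest of each frame's time budget."""
    budget = 1.0 / target_fps
    start = time.monotonic()
    deadline = start
    for _ in range(frames):
        # ... the actual render work for the frame would happen here ...
        deadline += budget
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)  # idle instead of rendering the next frame early
    return time.monotonic() - start

# 30 frames capped at 120 fps should take roughly 0.25 s in total
elapsed = run_capped(frames=30, target_fps=120)
```

Because each frame waits for its deadline, frametimes cluster tightly around the budget instead of oscillating, which is why a cap can smooth out micro stutter (and also why the GPU draws less power while capped).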


----------



## Kabouter Plop (Aug 22, 2022)

New 22.8.2 drivers are even more unstable than 22.8.1.
Before, it at least took 10 minutes to black screen and freeze; with 22.8.2 I can reproduce it within about 1 minute consistently. I do 2 video calls a day, sometimes 3, so it's pretty annoying that I can't even do video calls like I normally would.



AlwaysHope said:


> Does this series of cards go well in DX9 games?



No idea, but DXVK exists, so that's an option; it usually increases FPS by a lot more than you would probably need.
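For context on the DXVK suggestion: on Windows it usually amounts to unpacking a release and dropping the right DLL next to the game's executable. The version and game path below are just examples:

```shell
# Grab a DXVK release (version here is an example) and unpack it
wget https://github.com/doitsujin/dxvk/releases/download/v2.2/dxvk-2.2.tar.gz
tar -xzf dxvk-2.2.tar.gz
# A 32-bit DX9 game like Oblivion wants the 32-bit d3d9.dll
cp dxvk-2.2/x32/d3d9.dll "/path/to/Oblivion Game Folder/"
```

Deleting the copied DLL reverts the game to the native D3D9 path.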


----------



## AlwaysHope (Aug 23, 2022)

Kabouter Plop said:


> New 22.8.2 drivers are even more unstable than 22.8.1.
> Before, it at least took 10 minutes to black screen and freeze; with 22.8.2 I can reproduce it within about 1 minute consistently. I do 2 video calls a day, sometimes 3, so it's pretty annoying that I can't even do video calls like I normally would.



Thanks for the heads up. I'm sticking with 22.7.1 until the next WHQL driver update. I've had enough of being an end-user beta tester; I get enough of that just running Win 10 and keeping it up to date!



Kabouter Plop said:


> No idea, but DXVK exists, so that's an option; it usually increases FPS by a lot more than you would probably need.



Thanks, I'll look up DXVK, but I just want to play Oblivion, and I'm sure that's going to be capped at 60fps anyway.


----------



## Kabouter Plop (Aug 23, 2022)

AlwaysHope said:


> Thanks for the heads up. I'm sticking with 22.7.1 until the next WHQL driver update. I've had enough of being an end-user beta tester; I get enough of that just running Win 10 and keeping it up to date!
> 
> 
> 
> Thanks, I'll look up DXVK, but I just want to play Oblivion, and I'm sure that's going to be capped at 60fps anyway.



22.7.1 has those issues as well actually.


----------



## AlwaysHope (Aug 23, 2022)

Kabouter Plop said:


> 22.7.1 has those issues as well actually.


I haven't experienced anything out of the ordinary, but as usual it depends what games & apps you're running.


----------



## Kabouter Plop (Aug 23, 2022)

AlwaysHope said:


> I haven't experienced anything out of the ordinary, but as usual it depends what games & apps you're running.



Try doing WhatsApp video calls on 22.8.2; I can trigger a black screen within 1 minute on average. On 22.8.1 it takes about 10 minutes, or sometimes it's fine for an hour; same with 22.7.1.


----------



## AlwaysHope (Aug 23, 2022)

Kabouter Plop said:


> Try doing WhatsApp video calls on 22.8.2; I can trigger a black screen within 1 minute on average. On 22.8.1 it takes about 10 minutes, or sometimes it's fine for an hour; same with 22.7.1.


I don't have enough of a busy social life to use that app, so it's irrelevant to my usage model. And even if I did, I would never touch that Meta Platforms malware.


----------



## Kabouter Plop (Aug 23, 2022)

So apparently, if you turn on the new flip mode option for games in the new AMD drivers, you can speedrun black screens. The moment I turned it off, things lasted a lot longer; not sure how much longer, but hopefully it's the cause of all the black screens. Pretty disappointing that the option is not supported on newer drivers, although it should be a non-issue: it mostly affects old games, since all DX12/Vulkan titles already have flip mode and never needed the option. My only interest in it is that it lowers input lag.


----------



## AlwaysHope (Aug 23, 2022)

Is flip mode in 22.7.1 drivers?


----------



## Athlonite (Aug 23, 2022)

Kabouter Plop said:


> 22.7.1 has those issues as well actually.


Yup, this is why I went back to 22.6.1; at least those work without problems.


----------



## kambei.7s (Aug 28, 2022)

I noticed my RX 6600's fan control is tied to GPU load, not temperature. For example, running old Skyrim at 25% load (still low GPU usage on some AMD cards since the 14.4 driver) heats the card to 63°C and the fans won't spin up... that's something.

Also surprised by the OpenGL performance; it's really bad.

Found a workaround to remove the lag: *disabled PCI Express link state power management in the power plan* (Ryzen Balanced), set VSync to Always Off, and set FreeSync from "Optimized by AMD" to Off. No FPS limitation, no lag. Also disabled the fTPM feature.

How bad is it to leave PCIe link power management disabled?
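For anyone wanting to make the same power-plan change from a prompt instead of the Power Options GUI: Windows exposes documented `powercfg` aliases for the PCI Express "Link State Power Management" setting. A sketch of the equivalent commands (run from an elevated PowerShell/cmd; this is just the generic Windows setting, nothing AMD-specific):

```shell
# Set PCI Express > Link State Power Management to Off (0) on AC and battery
powercfg /setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
# Re-apply the active scheme so the change takes effect immediately
powercfg /setactive SCHEME_CURRENT
```

Setting the value back to 1 (Moderate) or 2 (Maximum power savings) restores the ASPM behavior if you want to undo the workaround.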


----------



## Kabouter Plop (Aug 29, 2022)

AlwaysHope said:


> Is flip mode in 22.7.1 drivers?



It's part of Windows 11 22H2. Flip mode is technically there by default for many modern, well-optimized games, especially DX12 or Vulkan, but it's usually not present in DX11 or older titles, though it can be.


----------



## GreiverBlade (Aug 29, 2022)

I've been on 22.8.2 since they released it... a 1-hour WhatsApp video call with my mother, lots of games and other tasks, zero issues. Well, I'm on Win10 21H1, that might be it...
Although if the WhatsApp bug did happen, it wouldn't matter much... I'd rather use my phone for WhatsApp video calls; my webcam is... well... an old Logitech from the early 2000s. I need to upgrade that one.

That continues my streak of "I've had zero rollbacks with AMD/ATI since my 3D Prophet 9700, unlike Nvidia, where I had to roll back quite often" (almost as often as those who post issues about the latest driver here).
I guess AMD/ATI is a fan of me and treats me favorably (or I'm a total lucksack).


----------



## Kabouter Plop (Aug 29, 2022)

I've had times where it would not crash for longer than an hour, and then times where it black screens within 1 minute consistently. You know when you have it when you do calls daily. In my case it doesn't matter if I do a driver-only install either, and I do not have an overclocked GPU. I had it on 21H2 as well as on 22H2; it seems to happen more frequently if you enable flip mode, though that may have been coincidence, I really don't know. I've given up on these drivers; I will wait for the next driver and test again. Hopefully it's finally a recommended stable driver.


----------



## GreiverBlade (Aug 29, 2022)

Kabouter Plop said:


> I've had times where it would not crash for longer than an hour, and then times where it black screens within 1 minute consistently. You know when you have it when you do calls daily. In my case it doesn't matter if I do a driver-only install either, and I do not have an overclocked GPU. I had it on 21H2 as well as on 22H2; it seems to happen more frequently if you enable flip mode, though that may have been coincidence, I really don't know. I've given up on these drivers; I will wait for the next driver and test again. Hopefully it's finally a recommended stable driver.


Sad that you have issues; for my part, I'm glad the drivers are stable with my usage profile.


----------



## Kabouter Plop (Aug 29, 2022)

GreiverBlade said:


> Sad that you have issues; for my part, I'm glad the drivers are stable with my usage profile.



Just keep testing; I still do not understand what triggers it. I've heard that for some people turning off hardware acceleration helps. I personally always run Wallpaper Engine; one time I disabled it and had no issues for a whole day, until I did a WhatsApp video call and black screened within 10 minutes, even though a 15-minute call that same morning had been no problem. It's very random. Of course, I have Wallpaper Engine running on the second display as well.


----------



## GreiverBlade (Aug 29, 2022)

Kabouter Plop said:


> Just keep testing; I still do not understand what triggers it. I've heard that for some people turning off hardware acceleration helps. I personally always run Wallpaper Engine; one time I disabled it and had no issues for a whole day, until I did a WhatsApp video call and black screened within 10 minutes, even though a 15-minute call that same morning had been no problem. It's very random. Of course, I have Wallpaper Engine running on the second display as well.


I also run Wallpaper Engine on a daily basis on my main rig. As you said, since I don't use desktop WhatsApp very often, that might be why I don't have any issues (WhatsApp is mainly for my work group and my parents; messages and calls, rarely video calls).

Also, as I mentioned once, I have HW acceleration on. YouTube? No issues (in the browser settings, and AV1 when available), but I use Opera GX at the moment...


----------



## kapone32 (Aug 29, 2022)

I was on 22.7.1 and it was solid for about 2 weeks before I had to actually reset Windows. I have been on 22.8.1 and the best test I could do was play TWWH Immortal Empires all day yesterday with no shutdowns. It had been solid though as I also have been playing plenty of Everspace2 and The Ascent as well. I hope that this solves whatever the black screen issue is caused by.


----------



## GreiverBlade (Aug 29, 2022)

kapone32 said:


> I was on 22.7.1 and it was solid for about 2 weeks before I had to actually reset Windows. I have been on 22.8.1 and the best test I could do was play TWWH Immortal Empires all day yesterday with no shutdowns. It had been solid though as I also have been playing plenty of Everspace2 and The Ascent as well. I hope that this solves whatever the black screen issue is caused by.


Just noticed I had one issue when I initially installed the cards weeks ago; I wonder if it was the drivers or just the game install. I could not launch DOOM Eternal in Vulkan; it kept saying no physical GPU. I didn't bother to retest until today. I never knew 1620p Vulkan, Ultra Nightmare preset, RT on would run in the 70s fps+ (70-98 fps, no major tearing or stutters).


----------



## AVATARAT (Sep 1, 2022)

I've been on 22.8.2 since they launched it, and reinstalled it yesterday, but it's not stable and can freeze/black screen/restart the PC on settings that "must be stable". It needs a few more points of voltage at the same clock.
Maybe 22.7.1 is the best of 22.5.1-22.8.2, but actually, older versions are more stable.


----------



## Kabouter Plop (Sep 2, 2022)

I really wish they'd do a proper driver for Windows 11 22H2 already. Microsoft released a new driver again; I know their OpenGL driver is stable for me, but the AMD control panel does not work unless I reinstall the entire system, because it refuses to install the whole driver. Another thing: the previous Microsoft drivers, at least, do not support ray tracing at all whatsoever, which is annoying; I run with ray tracing on max at 3840x1600 in WoW, with render resolution at 77% and FSR 1.0 enabled.
Also, with the 22.5.1 installer, and I think 22.6.1 as well as 22.5.2: if you use the AMD cleanup tool, then install these on 22H2 and reboot, all hardware drops out.
Network, WiFi, USB, all your storage except the Windows drive, it all drops out, as if your mainboard just gave you the middle finger. This happens exclusively on 22H2. I don't remember if 22.7.1 and up did this as well, since those are technically drivers for 22H2, but those aren't stable.


----------



## Athlonite (Sep 3, 2022)

Kabouter Plop said:


> I really wish they'd do a proper driver for Windows 11 22H2 already. Microsoft released a new driver again; I know their OpenGL driver is stable for me, but the AMD control panel does not work unless I reinstall the entire system, because it refuses to install the whole driver. Another thing: the previous Microsoft drivers, at least, do not support ray tracing at all whatsoever, which is annoying; I run with ray tracing on max at 3840x1600 in WoW, with render resolution at 77% and FSR 1.0 enabled.
> Also, with the 22.5.1 installer, and I think 22.6.1 as well as 22.5.2: if you use the AMD cleanup tool, then install these on 22H2 and reboot, all hardware drops out.
> Network, WiFi, USB, all your storage except the Windows drive, it all drops out, as if your mainboard just gave you the middle finger. This happens exclusively on 22H2. I don't remember if 22.7.1 and up did this as well, since those are technically drivers for 22H2, but those aren't stable.



If you haven't already done so, I'd suggest you update your BIOS to version 4403 (AGESA 1207); there are several fixes that may alleviate some, if not all, of your problems with Windows 11 22H2.


----------



## MTF96zn (Sep 4, 2022)

Hello guys, I wonder if anyone knows the size, dimensions, and thickness of the thermal pads used on the XFX RX 6700 XT QICK 319?


----------



## Kabouter Plop (Sep 7, 2022)

Athlonite said:


> If you haven't already done so, I'd suggest you update your BIOS to version 4403 (AGESA 1207); there are several fixes that may alleviate some, if not all, of your problems with Windows 11 22H2.



I would suggest downgrading to 1203c if you use PBO; doing so will save you a lot of headaches due to the EDC bug.


----------



## Kabouter Plop (Sep 12, 2022)

Just tested 22.6.1; that driver also has black screen timeouts during WhatsApp video calls.


----------



## Athlonite (Sep 12, 2022)

Kabouter Plop said:


> I would suggest downgrading to 1203c if you use PBO; doing so will save you a lot of headaches due to the EDC bug.


4403 seems to be working fine for me on my ROG Strix X570-F Gaming mobo.


----------



## Kabouter Plop (Sep 14, 2022)

Athlonite said:


> 4403 seems to be working fine for me on my ROG Strix x570 F - Gaming mobo



It's only a problem if you use PBO: even with default settings, EDC will limit vcore past stock. It's basically non-functional, which is not an issue with 1203c; it even affects stability.


----------



## Athlonite (Sep 14, 2022)

So manually set your EDC limit to something higher; problem solved.


----------



## Kabouter Plop (Sep 19, 2022)

Athlonite said:


> so manually set your EDC limit to something higher problem solved



I think you do not understand what the EDC bug is: it limits boost to no higher than default, aka stock, so you basically have no PBO. The real solution is to not upgrade past 1203c.


----------



## Athlonite (Sep 20, 2022)

Kabouter Plop said:


> I think you do not understand what the EDC bug is: it limits boost to no higher than default, aka stock, so you basically have no PBO. The real solution is to not upgrade past 1203c.


Oh OK I don't seem to have that problem even with 1207


----------



## Kabouter Plop (Sep 20, 2022)

Athlonite said:


> Oh OK I don't seem to have that problem even with 1207



Like I said, it's a PBO bug; judging from your specs, you're using a manual overclock.


----------



## MachineLearning (Sep 20, 2022)

Kabouter Plop said:


> 22.7.1 has those issues as well actually.


Yep, ever since the OpenGL/CL rewrite these GPUs will occasionally crash in some conditions. Specifically, I found that having a fullscreen OpenGL game in one monitor and a fullscreen video on another will cause a crash within a couple hours most of the time.



Kabouter Plop said:


> I think you do not understand what edc bug is, it limits it to no higher then default aka stock you have bassicly no PBO, the real solution is to not upgrade past 1203c


Not quite: the EDC bug limits CPU core voltage to no more than 1.4 V nominal. This severely limits single-thread boosting, since Zen will happily ask for 1.5 V for peak single-core clocks.

The solution is either to set EDC below 140 A (the bug will not occur and the CPU asks for up to 1.5 V) or, as you say, to use an AGESA before 1.2.0.4, preferably 1.2.0.0 - 1.2.0.3.


----------



## AVATARAT (Sep 22, 2022)

I recommend the latest driver, 22.9.1; it seems stable with OC too.


----------



## Kabouter Plop (Sep 22, 2022)

AVATARAT said:


> I recommend the latest driver, 22.9.1; it seems stable with OC too.



Nah, it's not stable; I can speedrun black screens even on 22.9.1 when it's configured the same as on 22.5.1, which is stable.


----------



## puma99dk| (Sep 22, 2022)

Kabouter Plop said:


> Nah, it's not stable; I can speedrun black screens even on 22.9.1 when it's configured the same as on 22.5.1, which is stable.


Have you asked AMD on their own community forums about your card yet?

I still have the feeling that you and a couple more people complain about every driver that gets released, no matter whether it's beta or WHQL.

Why don't you use the first driver ever released for the 6000 series cards and just keep it?


----------



## HD64G (Sep 22, 2022)

If your GPU is OCed or UVed, the newer driver could be less CPU-bottlenecked and thus push the GPU harder, causing the OC to not be as stable as before. Just a thought.


----------



## GerKNG (Sep 22, 2022)

22.9.1 finally fixes Enhanced Sync, after literally years.
Never thought they would ever do it.


----------



## Kabouter Plop (Sep 22, 2022)

HD64G said:


> If your GPU is oced or uved the newer driver could be less CPU bottlenecked and thus pushing more the GPU and causing the oc to not be as stable as previously. Just a thought.



That's not the issue: hardware acceleration is broken in browsers and other apps, probably just hardware-accelerated video decoding, since many websites like YouTube, Imgur, and Reddit have a lot of videos playing, but video call software is affected too. The 22.9.1 notes do list one issue with videos being choppy, but nothing about stability itself yet.


----------



## P4-630 (Sep 26, 2022)

__ https://twitter.com/i/web/status/1573658256857243648


----------



## Space Lynx (Sep 26, 2022)

P4-630 said:


> __ https://twitter.com/i/web/status/1573658256857243648



I just found a 6800 XT brand new for $529.99 too... seriously thinking about getting it. I just feel like RDNA3 is going to be sold out for 6 months because of scalpers, and fuck all the third-party sellers trying to make a buck.

I think I am going to do it. I actually used his RAM calculator a lot back in the day to help me overclock RAM... mad respect.



GerKNG said:


> 22.9.1 finally fixes Enhanced Sync, after literally years.
> Never thought they would ever do it.



have faith in the red team!!!  NEVER SURRENDER TO THE DARK SIDE! NEVER!!!!


----------



## GamerGuy (Sep 29, 2022)

P4-630 said:


> __ https://twitter.com/i/web/status/1573658256857243648


Yep, saw this 2-3 days back and thought of linking it here. It's great to see there are some who keep updating others on what the RX 6800 XT can do.


----------



## Kabouter Plop (Oct 6, 2022)

I'll just leave this here.

__
		https://www.reddit.com/r/Amd/comments/xvtn2u


__
		https://www.reddit.com/r/Amd/comments/xea47d

22.5.2 and up all have issues,
and on the Radeon subreddit you often see people reporting black screens and being told to go back to 22.5.1, which always solves the issue for them.

I hope AMD is not going to give up on getting the DX11 and OpenGL improvements stable, but I would take a stable driver over a performance uplift that only makes my GPU trip over its own feet.
Downclocking the GPU does not resolve black screens for me.
The only thing that may help, though I haven't really tested it yet, is not using the second screen; a dual-screen setup with mismatched refresh rates is a listed issue for 22.10.1.


----------



## 3x0 (Oct 6, 2022)

I've been very lucky not to experience black screens related to hardware acceleration with my 6600 XT, although I've had a weird issue with screen flicker in certain games in certain situations on 22.5.1, which was greatly diminished with later versions. Here's hoping all major issues get resolved in 22.10.2, as they've noted in the driver changelog for 22.10.1.


----------



## puma99dk| (Oct 6, 2022)

3x0 said:


> I've been very lucky not to experience black screens related to hardware acceleration with my 6600XT. Although I've had a weird issue with screen flicker in certain games in certain situations with 22.5.1, which was greatly diminished with later versions. Here's hoping all major issues get resolved with 22.10.2 as they've noted in their driver changelog for 22.10.1



The black screen is tiring and takes a lot of work to troubleshoot, even for AMD, because no two systems are the same; they're built from different parts. And be aware that @Kabouter Plop always complains along with others who run an RX 6900/6950 XT, and those are mostly OC'ed and watercooled with a full block, not an AIB cooler.

I haven't personally had any black screens with my current 1440p monitor; the last I had was with my 4K screen and Nvidia G-Sync, so Asus replaced it.

So far my RX 6800 XT Red Devil runs like a dream with Radeon Chill, because electricity prices suck. I'm down to about 150-180W in the Saints Row reboot, and it's fine: no lag, hiccups, or anything, even running up to 90fps with Radeon Chill on, and the weird 60fps default that can happen has also gone away.


----------



## Kabouter Plop (Oct 14, 2022)

May have found the source of my black screens, but I'm not exactly 100% sure yet what the problem is. It's WhatsApp for sure, but it may be the combination of a Logitech Brio plus broken WhatsApp that is triggering it, which may explain why some users have no problems; it's a 4K webcam. I discovered the problems around the time I got the new webcam and upgraded to 22.7.1: I got the webcam on 13 July, and 22.7.1 came out in late July, I believe. I could've sworn I remembered 22.6.1 being stable as well, but I've since confirmed it to be unstable; back when it originally released I had no black screens.


----------



## GreiverBlade (Oct 14, 2022)

Kabouter Plop said:


> I just leave this here.
> 
> __
> https://www.reddit.com/r/Amd/comments/xvtn2u
> ...


Still zero issues of the sort on my side.

Just updated to the latest 2 days ago when I came back from holidays.
2 screens: one 1080p60, one 1620p75.
No downclock or undervolting.

My webcam for WhatsApp is an oldie, 720p max though.


----------



## Kabouter Plop (Oct 14, 2022)

GreiverBlade said:


> Still zero issues of the sort on my side
> 
> Just updated to the latest 2days ago when I came back from holidays
> 2 screen 1 1080p60 1 1620p75
> ...



Get a Logitech Brio, get the same headaches as me, then switch to Signal and be surprised you don't black screen anymore.
Shame you don't live anywhere near me, or it would be fun to test my theory with another rig and my webcam.


----------



## outpt (Oct 15, 2022)

Just replaced my EVGA 970 (from a couple of centuries gone by) with a PowerColor Red Dragon RX 6800. I had no idea what I had been missing; it is no slouch. It will be a while before it gets replaced. Had an XFX 5600 XT and sold it just as crypto was hitting its stride, big mistake, and had to use my backup card for a while. Very happy with the purchase. When I purchased the 5600 XT, it was when RDNA first came out and there were driver issues to say the least, but AMD took care of the problems. Haven't had the 6800 that long, but it seems very stable.


----------



## puma99dk| (Oct 15, 2022)

outpt said:


> Just replaced my EVGA 970 (from a couple of centuries gone by) with a PowerColor Red Dragon RX 6800. I had no idea what I had been missing; it is no slouch. It will be a while before it gets replaced. Had an XFX 5600 XT and sold it just as crypto was hitting its stride, big mistake, and had to use my backup card for a while. Very happy with the purchase. When I purchased the 5600 XT, it was when RDNA first came out and there were driver issues to say the least, but AMD took care of the problems. Haven't had the 6800 that long, but it seems very stable.



The RX 6800 is a really great card. I had one, the AMD reference model, really nice. But once you've tried the RX 6800 XT, you won't go back.


----------



## sam_86314 (Oct 15, 2022)

outpt said:


> Just replaced my evga 970(from a couple of century’s gone by) with a power color red dragon rx 6800 I had no idea what I had been missing. It is no slouch. Will be a while before it gets replaced. Had a xfx 5600xt and sold it just as crypto was hitting it’s stride big mistake and had to use my backup card for a while. Very happy with the purchase. When I purchased the 5600xt it was when rdna  first come out and there were driver issues to say the least, but amd took care of the problems. Haven’t had the 6800 that long but it seems to be very stable.





puma99dk| said:


> The RX 6800 is a really great card. I had one, the AMD reference model, really nice. But once you've tried the RX 6800 XT, you won't go back.


I've had a PowerColor RX 6800 XT for a few months now, and it's a great card. I upgraded from an RX 5700 XT. Plenty of power (though even heavily modded Skyrim will bring it to its knees). I love being able to have Blender's Cycles viewport in almost real-time (and that's without hardware raytracing support, which will be added for AMD cards in Blender 3.5).

I've tuned mine for quiet operation, and I'm actually amazed at how much more efficient it is than the 5700 XT. It draws only about 40-60W more for double the performance. I limit mine to 240W.

AMD's Windows drivers still kind of suck (RuneScape in particular is a terrible experience on AMD cards in Windows; it's great on Linux).

Of course, now you can get a 6900 XT for the same price I paid for my 6800 XT (about $680 before tax). Oh well...


----------



## Saschman73 (Oct 16, 2022)

Hi! 
I have a PowerColor Radeon RX 6900 XT Red Devil (AXRX 6900XT 16GBD6-3DHE/OC) converted to water cooling in my PC. 
Does anyone know where I can find a description of which thermal pads (thickness, dimensions, position) I need if I want to reinstall the original air cooler?
Preferably with pictures if there is such a thing. 
Thank you for your help! 
Greetings from Vienna!


----------



## Kabouter Plop (Oct 16, 2022)

Fully stable until today, when I alt-tabbed and instantly black-screened; I then found out I had a video playing constantly in my browser. So I went back to 22.5.1 again. Oh well, 22.10.2 may come out tomorrow or the day after, since when a lot of new games launch, a new driver usually launches as well.


----------



## kapone32 (Oct 16, 2022)

Saschman73 said:


> Hi!
> I have a PowerColor Radeon RX 6900 XT Red Devil (AXRX 6900XT 16GBD6-3DHE/OC) converted to water cooling in my PC.
> Does anyone know where I can find a description of which thermal pads (thickness, dimensions, position) I need if I want to reinstall the original air cooler.
> Preferably with pictures if there is such a thing.
> ...


Why not use the ones that come with the card? The cost of good thermal pads is not high either. Usually the pads are between 0.5 and 1 mm thick on the front; they can be as much as 2 mm on the back. If you get something like this, you can measure what you need:



			https://www.amazon.ca/gp/product/B096ZNHY8F/ref=ppx_yo_dt_b_asin_title_o09_s00?ie=UTF8&psc=1
		




kapone32 said:


> Why not use the ones that come with the card? The cost of good thermal pads is not high either. Usually the pads are between .5 to 1 mm in thickness on the front. They can be as much as 2mm on the back. If you get something like this you can measure what you need..
> 
> 
> 
> https://www.amazon.ca/gp/product/B096ZNHY8F/ref=ppx_yo_dt_b_asin_title_o09_s00?ie=UTF8&psc=1


The kit that comes with the water block should also give you all you need. Even Bykski gives you everything you need.


----------



## sepheronx (Oct 16, 2022)

puma99dk| said:


> The RX 6800 is a really great card I had one the AMD reference card really nice but once you tried the RX 6800 XT you won't go back


I'm planning to sell my 6800xt since I'm using a 3080 now.

Good card but I myself just hated AMDs drivers except pro ones


----------



## outpt (Oct 16, 2022)

What are the advantages of the pro drivers?


----------



## Kabouter Plop (Oct 16, 2022)

sepheronx said:


> I'm planning to sell my 6800xt since I'm using a 3080 now.
> 
> Good card but I myself just hated AMDs drivers except pro ones



Blackscreens and driver crashes ?



outpt said:


> What is the advantages of the pro drivers


They're supposed to be more stable. I did have my PC freeze on the Radeon Pro Q3 driver; even so, it was more stable than all the optional drivers combined.


----------



## sepheronx (Oct 16, 2022)

Kabouter Plop said:


> Blackscreens and driver crashes ?
> 
> 
> They're supposed to be more stable. I did have my PC freeze on the Radeon Pro Q3 driver; even so, it was more stable than all the optional drivers combined.


Driver crashes. Hell, I can be gaming for hours and it's fine, but sometimes it crashes just watching a YouTube video. I replicated it on a 6600 and a 6800 XT.

Some drivers work great and others don't, so I had to roll back drivers. The latest were OK for me.

With Nvidia I don't face as many issues. I do with some titles, where a restart of the game fixes it, but yeah.

I am hoping the next AMD GPUs will come with even better driver development, because they are much further along now than before.


----------



## GreiverBlade (Oct 17, 2022)

It's kinda funny (and sad at the same time) that user experience varies with the same drivers... The worst, most unstable drivers I've ever had since I started building my own PCs (roughly 24 years ago, K6-2 and TNT1) were from Nvidia; I constantly had to roll back for at least 3 months each time they released a new driver (black screens, freezes, BSODs, et cetera). AMD/ATI? Zilch, nada, nope, not a single rollback or severe issue from a Rage Pro LT to today's RX 6700 XT.

Not even watching YouTube videos on the main rig or playing a game... I had a little hiccup launching Doom Eternal the first time, where the game would say there was no physical graphics card when trying to run Vulkan first, but that was the game's fault and not the driver's (since reinstalling the game fixed it instantly).

Of course every rig has a lot of variables (mainly hardware but also users), thus issues can happen for some where there are none for others... It's polarizing even among my friends and me: the majority are fine with AMD/ATI, like me, for years, while some others had so many issues that they switched to Nvidia for that reason (whereas my only Nvidia switches, GTX 460, then GTX 770, then GTX 980, then GTX 1070, were because of prices, supply and the occasional giveaway win, never driver issues).

Next I'm going to upgrade my webcam... 139 CHF for a Logitech Brio is not a biggie, since it's the last issue I can't reproduce (for lack of hardware, not for lack of crashes), and that bog-basic 720p webcam, also from Logitech, is starting to show her age (IIRC she's more than 12 years old).



sepheronx said:


> Hell, I can be gaming for hours and it's fine but sometimes just watching a youtube video and it crashes.


if it was the opposite i would understand  (crashing constantly while gaming and being able to watch youtube just fine )


----------



## puma99dk| (Oct 17, 2022)

Well, every system is unique even with the same parts, because there are always variables, and issues can be hardware or software.

A lot of people can have problems with video playback in browsers, depending on the browser. I had a black screen once with my LG monitor: I turned it off while a game was running because I needed to do something else for 5 minutes; usually I would have left it on, but not with these electricity prices.

I couldn't see anything in the event log for application or system, so that might just be my monitor needing a signal reset from the source. Everything else works fine, no hiccups, freezes or weird behavior on my card, and I am running a one- or at most two-week-old Windows 11 Pro 22H2 installation here.

For browsers I am using Brave and Firefox and have never had a video playback issue; the only issue is files not supported in the browser, but that's not a driver problem.



sepheronx said:


> I'm planning to sell my 6800xt since I'm using a 3080 now.
> 
> Good card but I myself just hated AMDs drivers except pro ones



Why not an Intel Arc A750/A770? The prices seem fine, and in 5 years or so it might be remembered as Intel's first try that no one really bought; they could get rare as heck and go up in price. They also support AV1, which is nice.

If I had money just to burn (I don't, because of the war in Ukraine and inflation), I would love to have the Intel Arc A770 just as a collection item, like my Sapphire Radeon RX 590 Nitro+ Special Edition.


----------



## GreiverBlade (Oct 17, 2022)

puma99dk| said:


> For browsers I am using Brave & Firefox never had a videoplayback issue only issue is files not supported in the browser but that's not a driver problem


Hilariously, I use Opera GX and have never had any issues (nor with any other Chromium-based browser), aside from files not supported too, but I had that with Nvidia as with AMD/ATI (well, duh... as you said, not a driver problem).


----------



## AusWolf (Oct 17, 2022)

sepheronx said:


> Driver crashes. Hell, I can be gaming for hours and it's fine but sometimes just watching a youtube video and it crashes. I replicated it on a 6600 and a 6800xt.
> 
> Some drivers work great and others don't. So I had to roll back drivers. Latest were OK for me.
> 
> ...


I've recently learned with my 6500 XT that the first thing for everybody with an AMD GPU to do is to disable hardware acceleration in the browser and video player apps.
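For Firefox that can be pinned in a `user.js` in your profile folder. The pref names below are from memory and may differ between versions, so treat this as a sketch and verify each one in about:config before relying on it:

```js
// user.js — sketch: force-disable hardware acceleration in Firefox
// Pref names are my assumption from memory; verify them in about:config first.
user_pref("layers.acceleration.disabled", true);            // turn off GPU compositing
user_pref("gfx.webrender.software", true);                  // fall back to software WebRender
user_pref("media.hardware-video-decoding.enabled", false);  // skip the GPU video decode path
```

In Chromium-based browsers the equivalent is just unticking "Use hardware acceleration when available" in settings.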


----------



## Kabouter Plop (Oct 17, 2022)

When I get a black screen, my 2nd screen freezes but still displays an image; the main screen, an LG ultrawide 3840x1600 @ 144 Hz, basically blacks out. It does not go to standby, it just stays black until it recovers, and when it does, the game is still running. From my experience a GPU driver crash should cause the game to crash as well, and it does not. I just read about someone who had 3 different GPUs and had the same driver issues with each, and I've even heard stories of someone buying a 6950 XT, having issues, and returning it because it confirmed his fears of bad drivers.


----------



## RJARRRPCGP (Oct 17, 2022)

No crashes here, so far! The last crashes I had were with a Navi 10 (RX 5600 XT), because the GPU core clock frequency was too high when OC'ing it.


----------



## AlwaysHope (Oct 17, 2022)

Enjoying 0 RPM fans, under 90 W total power consumption and under 60 °C hot spot temps on my Asus TUF Gaming RX 6800 XT OC edition while playing _Oblivion Game of the Year Edition Deluxe_ for the last month or so, with 22.7.1 working flawlessly... The fps is maxed, of course, on a 1440p/144 Hz monitor. I only have 3 mods active, and none of them have to do with graphics.


----------



## Saschman73 (Oct 17, 2022)

kapone32 said:


> Why not use the ones that come with the card? The cost of good thermal pads is not high either. Usually the pads are between .5 to 1 mm in thickness on the front. They can be as much as 2mm on the back. If you get something like this you can measure what you need..


I'm looking for the dimensions of the pads for installing the original air cooler!


----------



## GreiverBlade (Oct 17, 2022)

Saschman73 said:


> I'm looking for the dimensions of the pads for installing the original air cooler!


Measure the original ones that came with the card after removing the air cooler? You did keep them, right? Or were they in such bad shape that that wasn't possible?

... Well, I could disassemble my card and measure, but I don't really need to do so...

When I water-cooled my R9 290 I kept the pads between 2 sheets of cellophane in an airtight box (although I sold it with the block on and kept the reference blower cooler for collection purposes).
Aftermarket thermal pads usually come in bigger pieces than needed, which you cut to size.
As for placement: 1 adequately sized pad per vRAM chip and a strip over the phase MOSFETs (2 locations).


----------



## Saschman73 (Oct 17, 2022)

GreiverBlade said:


> you did keep them, right? or were they in such a bad shape that it was not possible to do so?


I wish I had picked them up! 
When disassembling, the pads stuck so hard to the components that I destroyed them.


----------



## Kabouter Plop (Oct 17, 2022)

I still have mine stuck on the cooler; not a single one stuck to my GPU. I haven't cleaned off the thermal paste yet either; I probably should, but I'm not planning on selling, keeping it as a spare just in case. I'll probably upgrade to the RX 7000 series next year, despite having driver issues.


----------



## GreiverBlade (Oct 17, 2022)

Saschman73 said:


> I wish I had picked them up!
> When disassembling, the pads stuck so hard to the components that I destroyed them.


OK, I see why... hmm... let me see.

If I find something relevant I'll edit this post (besides contacting PowerColor via their website to ask).

One option would be the Thermal Grizzly Minus Pad 8 thermal pads, IIRC 1 mm for the vRAM and 1.5/2 mm for the MOSFETs; they come in various forms (strips and squares).








Thermal Grizzly High Performance Cooling Solutions - Minus Pad 8: www.thermal-grizzly.com


----------



## RJARRRPCGP (Oct 17, 2022)

Athlonite said:


> Oh OK I don't seem to have that problem even with 1207


I have AGESA Combo V2 PI 1.2.0.6b (UEFI BIOS 2.20, from March) on my ASRock B550 PG Velocita. There's now a brand-new BIOS for October (2.30), but I don't want to flash it yet; I fear having to redo everything again because of Secure Boot!

2.30 has 1.2.0.7, according to ASRock. And I just wiped the Western Digital Black SN850 1 TB SSD and installed Windows 11 22H2 on September 20!


----------



## Valantar (Oct 18, 2022)

AusWolf said:


> I've recently learned with my 6500 XT that the first thing for everybody with an AMD GPU to do is to disable hardware acceleration in the browser and video player apps.


Does the 6500 XT even have a hardware decoder block? Or is it just the encoders that are missing? I've never had any trouble with hardware decoding on any of my AMD GPU systems (6900 XT, 6600, 4650G), but I guess YMMV.


----------



## GreiverBlade (Oct 18, 2022)

AusWolf said:


> I've recently learned with my 6500 XT that the first thing for everybody with an AMD GPU to do is to disable hardware acceleration in the browser and video player apps.





Valantar said:


> Does the 6500 XT even have a hardware decoder block? Or is it just the encoders that are missing? I've never had any trouble with hardware decoding on any of my AMD GPU systems (6900 XT, 6600, 4650G), but I guess YMMV.


Indeed, the 6500 XT, like the RX 6400, only lacks encoding hardware, IIRC.

Decoding? Should be no issue... my 6700 XT behaves just fine, so it's not the first thing for everybody, I guess.
And since I maybe want an RX 6400 LP for one of my rigs, I asked around and found that it works for decoding with no real issues on a setup like mine (i7-3770 and Q77 Express mobo), in browsers (Opera GX/Edge) or video players (VLC/KMP64).


----------



## AusWolf (Oct 18, 2022)

GreiverBlade said:


> indeed the  6500 XT only lack, like the RX 6400, encoding hardware iirc
> 
> decoding? should be no issues ... my 6700 XT behave just fine, it's not the first thing for everybody i guess
> and since i maybe want a RX 6400 LP for one of my rig, i asked around and found ... that it works for decoding no real issues on a setup like mine (i7-3770 and Q77 express mobo ) browser (Opera GX/Edge) or video player (VLC/KMP64)


Indeed it does have a decoder. 

The problem isn't decoding itself. Sometimes when I quit a video, the system turns into a stuttery mess, even the Windows desktop. Only a restart helps.

Oddly enough, I have a 6400 with an R3 3100 in one of my HTPCs, which doesn't have the same problem, so it might be system-specific. Maybe my 6500 XT isn't entirely happy with the B560 Rocket Lake platform. Or maybe it's the fact that I've had Nvidia cards in the system, with only a DDU run on the same old Windows installation.


----------



## GreiverBlade (Oct 18, 2022)

AusWolf said:


> Oddly enough, I have a 6400 with a R3 3100 in one of my HTPCs, which doesn't have the same problem. It might be a systemic thing. Maybe my 6500 XT isn't entirely happy with the B560 Rocket Lake platform. Or maybe it's the fact that I've had nvidia cards in the system with only a DDU run on the same old windows installation.


My main rig is an old install that stemmed from Win 7 through 8.1 to 10, mixed through the years between AMD and Intel CPUs with multiple AMD and Nvidia cards, DDU'd and cloned only once, since for ~9 of those 11 years I had an OCZ Vertex 3 120 GB (yep, 9 years with a Vertex 3 as the main OS drive, talk about lucksack syndrome) and then a Samsung 970 Evo 240 GB. On the last swap, from the GTX 1070 to the current RX 6700 XT, I had to run DDU twice and update the driver to the latest using Adrenalin, because the TPU driver wasn't the latest one. Aside from some weird performance in games during the first driver install, it has been smooth sailing up to 22.10.1.

Now I am pondering, since I need to upgrade my webcam either way, getting a Logitech Brio to see if it's what causes Kabouter's problems in WhatsApp (in which case I would put the fault on the webcam, not on the drivers or the card), although a 2.5K camera would be enough for me (and half the price of the Brio).


----------



## puma99dk| (Oct 19, 2022)

@GreiverBlade So was my old installation on my Samsung 970 EVO Plus, but on my new Gigabyte M30 1 TB NVMe SSD with 2 GB of cache I did a clean install of Windows 11 Pro 22H2, and wow, it's fast, especially transfers thanks to the 2 GB cache.


----------



## Kabouter Plop (Oct 23, 2022)

New 22.10.2 driver: WhatsApp video calls behave even worse. First time I've seen it this laggy, like 3 fps across the desktop, and then black-screening. It seems as long as I do not run WhatsApp Desktop I'm fine. I haven't noticed yet whether I'm still getting alt-tab black screens; usually those only happen every couple of days, so I have to wait and see. Still getting alt-tab lag and glitches, though.
Still testing 22.10.2, as it has a lot of fixes but also newly listed issues.


----------



## outpt (Oct 25, 2022)

At what point do I need to be concerned about the hotspot temperature? Just noticed that HWiNFO is showing 99 °C. I've had no crashes, and the GPU doesn't appear to be throttling, going by Afterburner readings. The card goes to 79-80 °C and no higher.


----------



## Super Firm Tofu (Oct 25, 2022)

outpt said:


> At what point do I need to be concerned about the hotspot temperature ? Just noticed that hwinfo is showing 99C. Have had no crashes and the gpu doesn’t appear to be throttling going by afterburner readings. The card goes to 79-80C and no higher.



I believe that 105 °C is throttle and 115 °C is shutdown for the hot spot. You might consider raising the fan curve, a slight undervolt, or both to help a bit.


----------



## Kabouter Plop (Oct 25, 2022)

Tested 22.10.2 for a while now; they are not stable, not even close. It's as if the longer I have them installed, the more problems I get, as if the shader cache is corrupting itself over time and causing problems, except resetting it changes nothing.

Worst driver yet for WhatsApp video calls: it makes the entire desktop super laggy, and even a little visible lag can cause anything to black-screen, even without WhatsApp Desktop.

An AMD rep told me they think this is a Microsoft issue, and they forwarded my kernel dump to Microsoft saying the video scheduler is broken. What makes no sense, though, is why 22.5.1 is fine but 22.5.2 and up is not.


----------



## AusWolf (Oct 25, 2022)

Kabouter Plop said:


> Tested 22.10.2 for a while now they are not stable and not even close its as if the longer i have them installed the more problems i get, as if shader cache is corrupting it self over time and causing problems except reseting it changes nothing.
> 
> Worst driver for whatsapp video calls makes entire desktop super laggy, visible lag even if little can cause anything to blackscreen even without whatsapp desktop.
> 
> AMD rep told me they think this is a Microsoft issue and they forwarded my kernel dump to Microsoft saying the video scheduler is broken, what makes no sense tho is why 22.5.1 is fine but 22.5.2 and up is not.


Is there a way to disable video hardware acceleration in WhatsApp? Is this present in other programs too?


----------



## Kabouter Plop (Oct 26, 2022)

AusWolf said:


> Is there a way to disable video hardware acceleration in WhatsApp? Is this present in other programs too?



No. I've tried: I gave them feedback via the WhatsApp Desktop beta insisting they add an option to disable it, and they just pretend to be tech support rather than take feedback as intended, ignoring the feature request. Even more annoying: technically WhatsApp is supposed to allow video calls in the web version as well, but it does not, even though alternatives like Google Meet let you.

I'm pretty sure at this point they're forcing enhanced sync enabled all the time, and when turned off it behaves like regular vsync; they did this intentionally, causing more feedback.
On 22.10.2 enhanced sync behaves like this:
Alt-tab out: fps locked at the monitor refresh rate, regular vsync.
Alt-tab back in: fps exceeds the monitor refresh rate; enhanced sync flips back on, behaving like normal.
Play a video on the main screen: fps locked at the refresh rate, regular vsync; video playing on the 2nd screen: regular enhanced sync.
Recording with AMD ReLive: fps locked at the refresh rate again.

They did not fix enhanced sync; they put a band-aid on it and think they fixed it. Instead they're now plaguing users with the black screens that used to be triggered by enhanced sync. To top it off, they think the video scheduler is broken and forwarded my dump file to Microsoft.

It would not surprise me if, were I to enable enhanced sync on 22.5.1, I'd get black screens again doing WhatsApp video calls, maybe even under 2 minutes with flip model optimizations turned on.

Found a band-aid for WhatsApp video crashing.
I mentioned before that if you record the desktop, your fps is limited to the refresh rate; when this happens, WhatsApp Desktop does not black-screen under 2 minutes despite having flip model optimisations turned on. I still have to find out if it holds under 3 hours, though. Another thing I noticed (and I'm a bit slow to realize this, because I'd noticed it before): when I moved windows across the screen it was laggy and not smooth, so I tried recording the desktop to show the bug, but instead of recording unsmooth window dragging, it became smooth again. Today I noticed that while recording, it's locked at 144 fps.
So my theory that AMD is forcing the enhanced sync state on all the time may be correct; the only thing the setting now probably does is put it back in the vsync state, but if it's disabled it still crashes, so my guess is this switch in the control panel is also broken.

People who have instant replay plus desktop recording turned on all the time probably have no issues for this very reason.

40 minutes in, no black screen; this is starting to look good.






Awkward 




I've had black screens on an Nvidia GPU as well, but they were easy to avoid:
Have "Prefer maximum refresh rate" turned on in the Nvidia control panel.
Fire up Dark Souls 3.
The game wants 60 Hz, but the driver wants 144 Hz.
Driver and game battle it out for a couple of seconds.
Sometimes the driver loses and you get a black screen.
Sometimes nothing happens and the game fires up properly.

Moral of the story: some apps do not respect the driver configuration and will fight over it, and if the driver loses its configuration battle, the synchronisation between the display and the GPU freezes up, displaying the last color it received, in some cases black. Driver timeouts are also not detected, so I guess this is also a Microsoft issue.
Apps should not be able to fight with the driver over configuration; it's probably due to poor coding in an incompatible app.
One thing Nvidia does differently, however: if a configuration has problems, that configuration is not allowed for the app.

AMD, by the looks of it, does not do this properly yet. Although black screens used to happen with enhanced sync, they made it, I guess, such that if a video is played, the enhanced sync behaviour just drops to regular vsync, limiting fps to 144 on a 144 Hz display, likewise when recording; this is why I never managed to record the drag lag or window dragging being less smooth or not smooth at all.

AMD should be able to detect which apps are running when a black screen occurs and blacklist incompatible apps from using enhanced sync (only regular vsync), or just force the app to crash until that app's software team catches on and fixes their broken app, instead of giving users like me headaches.
And AMD should re-list enhanced sync as broken, because it's obvious this led to these issues. I doubt it will be fixed until Microsoft codes something into DirectX to support technologies like fast sync and enhanced sync properly. Fast sync is not working either and sometimes causes games to crash, but I would settle for that over a black screen.
But I never would settle for a 12-pin plug that overheats and melts.

3 hours and no black screen. It's getting late; I will test the band-aid on 22.7.1 tomorrow. I've left instant replay on for now. Gonna check if forcing vsync can fix it as well without the desktop-recording band-aid; in theory video playback on a loop could fix it too, as long as it's on screen in the foreground.


----------



## Kabouter Plop (Oct 28, 2022)

Found a better fix now: I limit fps globally via RTSS so it never exceeds 140 fps, because any kind of vertical synchronisation / enhanced sync only engages at 144 fps or above.
I also applied the MPO fix that Nvidia users were plagued by; apparently that fixed a lot of issues for someone else on another forum. WhatsApp no longer behaves like it's drag-lagging when resizing, and neither does Steam, which apparently had similar behaviour, nor Discord. It's as if all the hardware acceleration issues got fixed, probably due to the MPO fix. So AMD drivers suffer from it too now, apparently only after 22.5.2, and it gradually got worse with each driver update.
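For anyone who wants to try the same thing: the MPO workaround I mean is, as far as I know, the registry tweak Nvidia published for their version of this bug (the value below is from their support page as I remember it; double-check before applying, and just delete the value and reboot to re-enable MPO):

```reg
Windows Registry Editor Version 5.00

; Disable Multi-Plane Overlay via DWM's overlay test mode.
; 0x5 is the documented "disable" value; remove the entry to restore defaults.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005
```

A reboot is needed afterwards for it to take effect.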

I think synchronisation is still broken despite that, so I've limited my fps for now to 140 on my 144 Hz display. These issues have been pissing me off so much that I've been considering switching to Nvidia, but with their new 12-pin I would rather not.
It daisy-chains the 12-pin to up to four 8-pins; if 1 disconnects it will draw 600 W over three 8-pins, and if 2 disconnect then 600 W over two 8-pins, and I'm willing to bet that at 3 disconnected pins it heats up enough to melt the connector from the inside out, as it acts like a heatsink. So I'd rather not touch an Nvidia GPU anytime soon.
Hope my drivers are now stable. I couldn't care much whether AMD decides to report this or not: the more people have driver issues, the more will complain, and the more trolls will claim it's just hardware when it's just their drivers not handling MPO anymore.

Amazing, this just stole focus on my screen. I guess they are having their drivers coded by cheap labor, just like Asus does with Armoury Crate.


http://imgur.com/4USwijF

Don't mind the extreme brightness, that is HDR; what matters is the automatic update popping up and stealing focus randomly.


----------



## GreiverBlade (Oct 29, 2022)

What the? Enhanced sync is off by default (the weird part being that my monitor is not FreeSync compatible, but Adrenalin reports "AMD Optimized" in the toggle setting).

Updated to 22.10.2; hardware acceleration is on (not that I had issues on 22.10.1). I got a black screen! During the driver install... so that's normal...

I don't use auto-update, it seems, because I never get that popup... well, I have update check on but download updates off.

Issues with enhanced sync (you did manage to write it as "enchanced" in the Gru meme, though; typos can be a systematic reflex, I am also prone to that) happen above 140 fps? No wonder I would not see them... most of my games rarely exceed 100 fps and my screen is 60/75 Hz (I am part of the kind of people for whom anything above 100 Hz yields zero improvement).

We need a bit of positivity: who else here is also a lucksack and is having a blast with their card and the latest drivers, no issues?
Maybe link the hardware and software you use on a daily basis, so we could find the culprit plaguing those who have issues, because to me it seems to be due to some underlying condition. WhatsApp seems to be one, so it could be WhatsApp's coding at fault; not 100% sure, since I don't have that issue, same with YouTube HWA.


----------



## AusWolf (Oct 29, 2022)

GreiverBlade said:


> what the? enhanced sync is off by default    (weird one would be that my monitor is not Freesync compatible but Adrenalin report "AMD Optimized"in the toggle setting )
> 
> updated to 22.10.2 hardware acceleration is on (not that i had issues on 22.10.1 ) i got a blackscreen! during driver install ... so thats normal ...
> 
> ...


Well, I don't use Whatsapp on Windows, but I did have issues with Youtube. Turning hardware acceleration off in Chrome fixed it, though.

I am also on 60 Hz with no plans to upgrade.


----------



## AlwaysHope (Oct 29, 2022)

Ever since I got my RX 6800 XT, I've been playing DX9 games... & now probably for longer too, thanks to this mod for _Oblivion_.

All this because of a lack of quality open-world RPGs among newish releases these days. It seems until Bethesda releases Starfield, there isn't much out there in this genre unless one wants to go with Cyberpunk 2077.


----------



## sam_86314 (Oct 29, 2022)

AlwaysHope said:


> Ever since I got my RX 6800 XT, been playing DX9 games...& now probably for longer too, thanks to this mod for _Oblivion   _
> 
> All this because of a lack of quality open world RPG these days with newish releases. Seems until Bethesda release Starfield, there isn't' much out there in this genre unless one wants to go with cyberpunk 2077.


Oh yeah, that's one of the main things I do with my 6800 XT; play older games with mods.

Obviously, stuff like Skyrim or Dragon's Dogma runs great maxed, but once you start throwing graphical mods at them, they start to push the 6800 XT.

My current Skyrim SE install actually requires me to turn the resolution down to a little over 1080p for playable framerates because of the ENB and texture mods I'm using. Fortunately, RSR is a thing that exists, so the drop in visual quality is insignificant.


----------



## GreiverBlade (Oct 29, 2022)

sam_86314 said:


> Oh yeah, that's one of the main things I do with my 6800 XT; play older games with mods.
> 
> Obviously, stuff like Skyrim or Dragon's Dogma runs great maxed, but once you start throwing graphical mods at them, they starts to push the 6800 XT.
> 
> My current Skyrim SE install actually requires me to turn the resolution down to a little over 1080p for playable framerates because of the ENB and texture mods I'm using. Fortunately, RSR is a thing that exists, so the drop in visual quality is insignificant.


My Skyrim runs fine on the 6700 XT... well, fine-ish... As I mentioned once in "what are you playing": 60 fps rock solid with a shit-ton of mods, but 2 followers manage to bring it to its knees.
30 fps when they are present with their fully HDT-SMP hair, and I play at 2880x1620, no AA (well, a rock-solid 54.8 fps average, but the Steam fps counter shows 60 more often than not).


Other games that I play, including Cyberpunk 2077 (even with RT on, performance preset), are just day and night compared to my previous GTX 1070.


----------



## Kabouter Plop (Oct 29, 2022)

Seems disabling MPO is fixing it for more users; the same BS happened to Nvidia, more or less the same issues. I wonder how long it will take AMD to report this issue, what advice they're going to give to fix it, or whether they'll fix it themselves by forcing MPO disabled during future driver installs.


----------



## GreiverBlade (Oct 29, 2022)

Kabouter Plop said:


> Seems disabling MPO is fixing it for more users; same BS that happened to Nvidia, more or less the same issues as well. i wonder how long it takes for AMD to acknowledge this issue and what advice they're gonna give to fix it, or whether they'll fix it themselves by forcing MPO disabled during future driver installs.


MPO is used mainly for borderless, right?


----------



## Wheeljack4219 (Oct 29, 2022)

just ripped out the GTX 1660 Super 6GB HP custom ITX card for an MSI Mech 2X AMD Radeon RX 6600 8GB. Can't wait to do 1440p gaming! TPU's GPU database puts the card about 30 to 40 percent ahead of the GTX 1660 Super. And it's the fastest card that uses a single 8-pin PCIe power connector (132 watts according to TPU's GPU database) that I could put into a 400 watt PSU without killing it. TPU's database also ranks it as the fastest card recommended for a 300 watt PSU.


----------



## Kabouter Plop (Oct 29, 2022)

GreiverBlade said:


> MPO is used mainly for borderless, right?



No idea; ever since i disabled it my blackscreens went away. a whatsapp video call can now run for 8 hours, even on a driver that triggered it under 2 minutes. other things i did: cap fps below the refresh rate so no synchronisation like vsync can ever engage, etc. but im testing without an fps cap now, at 10 bit 120hz and with enhanced sync.


----------



## GreiverBlade (Oct 29, 2022)

Kabouter Plop said:


> No idea; ever since i disabled it my blackscreens went away. a whatsapp video call can now run for 8 hours, even on a driver that triggered it under 2 minutes. other things i did: cap fps below the refresh rate so no synchronisation like vsync can ever engage, etc. but im testing without an fps cap now, at 10 bit 120hz and with enhanced sync.


well, i am glad you found a workaround about your issues, now i do hope, too, that AMD will address the issue for all those who have it


----------



## jext (Oct 29, 2022)

Should I repaste my GPU? The difference between GPU temp and hotspot is pretty big (64 vs 110) running the DXR feature test in 3DMark, overclocked.


----------



## Valantar (Oct 30, 2022)

jext said:


> Should I repaste my GPU? The difference between GPU temp and hotspot is pretty big (64 vs 110) running the DXR feature test in 3DMark, overclocked.


Is that with the stock cooler and paste? That's completely unacceptable, both in terms of the avg/hotspot delta (a 40°C delta is way too much) and in absolute temperatures (that hotspot temp is reaching its throttle point).

If it's stock, I would RMA the card - that isn't acceptable stock operation, and speaks to either a badly designed cooler or poor QC. If not, then yes, definitely repaste, but also check thermal pad thicknesses - excessive pad thickness can cause poor contact.
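As a rough rule of thumb in code form, for anyone reading their own sensor numbers: the 40°C delta and ~110°C throttle thresholds below are just the numbers from this post, not AMD specs.

```python
# Rough sanity check encoding the rule of thumb above: an avg-to-hotspot
# delta of ~40°C or more, or a hotspot at the ~110°C throttle point,
# suggests poor die contact (paste/pads) or a cooler problem.
# The thresholds are this post's rules of thumb, not AMD specifications.

def assess_thermals(avg_temp_c, hotspot_c, max_delta_c=40.0, throttle_c=110.0):
    """Return a list of warnings; an empty list means temps look sane."""
    warnings = []
    delta = hotspot_c - avg_temp_c
    if delta >= max_delta_c:
        warnings.append(f"avg/hotspot delta of {delta:.0f}°C is too large")
    if hotspot_c >= throttle_c:
        warnings.append(f"hotspot {hotspot_c:.0f}°C is at the throttle point")
    return warnings

print(assess_thermals(64, 110))  # the reported 64/110 numbers: both warnings fire
print(assess_thermals(65, 95))   # 30°C delta, below throttle: no warnings
```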


----------



## Kabouter Plop (Oct 30, 2022)

GreiverBlade said:


> well, i am glad you found a workaround about your issues, now i do hope, too, that AMD will address the issue for all those who have it


Apparently, i read somewhere on this forum (as it was linked) that if a certain key is missing,
it's enabled:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005

You can copy HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm into the navigation bar of regedit to quickly navigate to it.
MPO enabled gives reduced latency tho, so i guess this is what is causing the problems.

The Nvidia driver notes mentioned forcing vsync off was a big no-no; stuttering and flickering issues etc. were also present with the Nvidia driver.
Guess what, it's been on the known issues list for quite some time now, and some still report issues despite AMD claiming they fixed it.


----------



## jext (Oct 30, 2022)

Valantar said:


> Is that with the stock cooler and paste? That's completely unacceptable, both in terms of the avg/hotspot delta (a 40°C delta is way too much) and in absolute temperatures (that hotspot temp is reaching its throttle point).
> 
> If it's stock, I would RMA the card - that isn't acceptable stock operation, and speaks to either a badly designed cooler or poor QC. If not, then yes, definitely repaste, but also check thermal pad thicknesses - excessive pad thickness can cause poor contact.


Thanks, it's with the stock cooler and paste. I am just hesitant to do anything because this only happens with that app. There is still a 30 degree delta (between GPU 'temp' and hotspot) in Time Spy and Time Spy Extreme, but the max hotspot temp doesn't exceed 95.

Edit: attached screenshot after timespy run


----------



## Kabouter Plop (Oct 30, 2022)

Apparently 10 bit and 12 bit displays do not support MPO, which may explain why some users have no issues


----------



## AlwaysHope (Oct 30, 2022)

sam_86314 said:


> Oh yeah, that's one of the main things I do with my 6800 XT; play older games with mods.
> 
> Obviously, stuff like Skyrim or Dragon's Dogma runs great maxed, but once you start throwing graphical mods at them, they start to push the 6800 XT.
> 
> My current Skyrim SE install actually requires me to turn the resolution down to a little over 1080p for playable framerates because of the ENB and texture mods I'm using. Fortunately, RSR is a thing that exists, so the drop in visual quality is insignificant.


I have plenty of hrs up already on _Skyrim SE_, don't know if I could play it again... you know there is a point where you become too familiar with the map & it loses its shine, so to speak. Interesting thing about _Oblivion_ is that I can max out the refresh rate on my 144hz monitor with no breakage of physics, unlike with _Skyrim_ or even its SE version, where the mods to keep fps higher than 60 without breaking physics are finicky & barely stable at best. In _Oblivion Game of the Year Edition Deluxe_, it's cool playing a game with GPU ASIC power never exceeding 90w at any point (as measured by HWiNFO) on my last playthrough with only 4 mods. Generally it's always around 60w! That's some nice RDNA2 efficiency there with an old DX9 game. Still on 22.7.1 drivers, only because they are still stable.

If I scrape the steam specials list, I might splash out on a more modern open world RPG just to get out of DX9 territory!  



Kabouter Plop said:


> Tested 22.10.2 for a while now; they are not stable, and not even close. it's as if the longer i have them installed, the more problems i get, as if the shader cache is corrupting itself over time and causing problems, except resetting it changes nothing.
> 
> Worst driver for whatsapp video calls; it makes the entire desktop super laggy, and visible lag, even if little, can cause anything to blackscreen, even without whatsapp desktop.
> 
> An AMD rep told me they think this is a Microsoft issue and they forwarded my kernel dump to Microsoft saying the video scheduler is broken. what makes no sense tho is why 22.5.1 is fine but 22.5.2 and up is not.


Thanks for the heads up on that. I was going to upgrade from 22.7.1 to WHQL 22.10.2, but after being informed of your experience, it just proves that if it ain't broke, don't touch it! That old saying never gets old in the computer world.


----------



## sam_86314 (Oct 30, 2022)

Here's what my Skyrim looks like. This is at 3440x1440 with my GPU at 2150MHz and 240W PL.






I've noticed that in this particular area (near Cradlecrush Rock), I seem to be hitting some sort of bottleneck. My FPS in this shot was around 50; turning ENB off, it goes up to around 65, but my GPU usage drops to like 40%. I tried messing with my GPU power limit and clock, and even going up to 2500MHz made next to no difference to framerate, and GPU usage even dropped to the mid-90s.

VRAM usage was like 11GB because 8K textures lol.


----------



## Hekusaurus (Oct 30, 2022)

What are the thermal pad thicknesses on the non-XT variant of the 6800 reference card? I am thinking of trying copper plates on the vRAM chips. Temps aren't crazy (~70°C), but if there is room for improvement I would like to try. 

Have you guys noticed micro stutters on these cards btw? I've had mine for a year and it has always bugged me. I've rolled back drivers and Windows, went to Linux, underclocked, overclocked, but the problem persists.


----------



## GreiverBlade (Oct 30, 2022)

Hekusaurus said:


> What are the thermal pad thicknesses on the non-XT variant of the 6800 reference card? I am thinking of trying copper plates on the vRAM chips. Temps aren't crazy (~70°C), but if there is room for improvement I would like to try.
> 
> Have you guys noticed micro stutters on these cards btw? I've had mine for a year and it has always bugged me. I've rolled back drivers and Windows, went to Linux, underclocked, overclocked, but the problem persists.


thermal pads, as i mentioned once: Thermal Grizzly Minus Pad 8, iirc 1mm for the vRAM and 1.5/2mm for the MOSFETs








Thermal Grizzly High Performance Cooling Solutions - Minus Pad 8
www.thermal-grizzly.com




stutter? none. i can run an Android emulator alongside a twelve-tab Opera GX while playing a heavy game without noticing any stutter, crash or anything else



sam_86314 said:


> Here's what my Skyrim looks like. This is at 3440x1440 with my GPU at 2150MHz and 240W PL.
> 
> 
> 
> ...


fun ... well i use an optimised ENB, but at 3k 

no oc, nothing fancy

but the backside ... ooohhhh the backside ....

everything moves, i have "Skyrim is windy" installed

power draw: around 2363MHz core, 1992MHz vRAM, 170W, and a dip to 57fps in that area, around 7.5gb vRAM usage tho

but thanks to you i need to check my lights and weather mods .... even at weather preset clear 3 on Proteus it feels more foggy than i got used to  



AlwaysHope said:


> I have plenty of hrs up already on _Skyrim SE_, don't know if I could play it again... you know there is a point where you become too familiar with the map & it loses its shine, so to speak.


6672hrs and counting (officially 1157.1hrs on LE and 1602.8hrs on SE, and more since 2011 on a non-official build, aka cracked; i owned the game, physical box, but used a cracked version due to some issues with Steam). i can't get bored of Skyrim: always a new DLC-sized mod, always something new, or even a total conversion. Morrowind is the same for me, and Oblivion ... well, i love Oblivion, but less than TESV and TESIII. the refresh rate is not a bother ... 60hz for a game like that is already plenty, overkill imho

my Skyrim SE uses 223 normal mods (ESL/ESM) and 328 light (ESL-flagged) ones, stable. well, i wouldn't break physics even if i uncapped ahah; 60 fps is the highest, 54.8 the average, and 30, well, the lowest, but it only appears when Dread and Jeeta are in the field of view 



Kabouter Plop said:


> Appearntly 10 bit and 12 bit displays do not support MPO so this may explain why some users have no issues


my monitor is a 8 bit and i did not disable MPO


----------



## AlwaysHope (Oct 30, 2022)

GreiverBlade said:


> ....
> 
> 
> 6672hrs and counting (officially 1157.1hrs on LE and 1602.8hrs on SE, and more since 2011 on a non-official build, aka cracked; i owned the game, physical box, but used a cracked version due to some issues with Steam). i can't get bored of Skyrim: always a new DLC-sized mod, always something new, or even a total conversion. Morrowind is the same for me, and Oblivion ... well, i love Oblivion, but less than TESV and TESIII. the refresh rate is not a bother ... 60hz for a game like that is already plenty overkill, imho
> ...



Yeah, I know there are always mods & some pretty good sounding ones too - at least in the description of them. But the real problem with _Skyrim_ in general is the game engine. I tend to only use mods listed among the top ones of all time for a given game. That way I know plenty of beta testing has already been done before I take a risk with them. There is nothing worse than putting in the hrs to build up a character only to have the game CTD just at a critical point in a quest! This tended to happen almost predictably with the Creation Engine in either _Skyrim_ or _Skyrim SE_. Even the Unofficial Patches got annoying because of the constant updating over the yrs - fix one or a few things, break others...


----------



## GreiverBlade (Oct 30, 2022)

AlwaysHope said:


> Yeah, I know there are always mods & some pretty good sounding ones too - at least in the description of them. But the real problem with _Skyrim_ in general is the game engine. I tend to only use mods listed in the top ones of all time for a given game. That way I know plenty of beta testing has already been done before I take a risk with them. There is nothing worse than putting in the hrs to build up a character only to have the game CTD just at a critical point in a quest! This tended to happen almost predictably with the creation engine in either _Skyrim _or _Skyrim SE _. Even with the Unofficial patches - that got annoying because of the constant updating over the yrs - fix one of a few things, break others...


"mod it until it crashes" is my moto ... top ranked or brand news ... no matter to me, well i have way less CTD since i swapped GPU tho  (literally 1 CTD in 670hrs till now )
as for update, i use a downpatcher to 1.6.353.0 since my current modlist is not compatible with 1.6.640.0  but Bethesda should learn to not touch what is not broken ... SE issue started with AE and were induced by the AE update, AE was an insult and totally uncalled for ... they could just have made a few free DLC and keep the previous compiler (i will not buy the AE pack nonetheless, the free update is already enough of a mess  )

currently i do not really care about character build since i can redo it in a matter of minutes, as i play alternate start and i have multiple clean backup that can be used to clear the problems if i ever encounter one.

the RX 6700 XT is what saved my Skyrim  (not that i did mind 30fps average and 45fps max on the GTX 1070  ) well moded skyrim, that is


----------



## Hekusaurus (Oct 30, 2022)

GreiverBlade said:


> thermal pads, as i  mentioned once : Thermal Grizzly Minus 8 Thermal Pads iirc 1mm for the vRAM and 1.5/2mm for the MosFET



Could you say where you pulled those numbers from? I went through a couple of your posts and you suggested the same pads for a person with a Red Devil card.


----------



## puma99dk| (Oct 30, 2022)

Kabouter Plop said:


> Appearntly i read somewhere on this forum as it was linked that if certain key is missing
> its enabled.
> [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
> "OverlayTestMode"=dword:00000005
> ...



For Windows 10/11 the path needs to be Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm, not just HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm. I doubt people with RX 6000 series cards are still on XP or something


----------



## GreiverBlade (Oct 30, 2022)

Hekusaurus said:


> Could you refer where you pulled those numbers from? I went through couple of your posts and you suggested the same pads for person having Red Devil card.


Yes this is why I wrote "as I mentioned once"

Usually thermal pads are about the same thickness from one brand to another


----------



## sam_86314 (Oct 30, 2022)

GreiverBlade said:


> 6672hrs and counting (officially 1157.1hrs on LE and 1602.8hrs on SE, and more since 2011 on a non-official build, aka cracked; i owned the game, physical box, but used a cracked version due to some issues with Steam). i can't get bored of Skyrim: always a new DLC-sized mod, always something new, or even a total conversion. Morrowind is the same for me, and Oblivion ... well, i love Oblivion, but less than TESV and TESIII. the refresh rate is not a bother ... 60hz for a game like that is already plenty overkill, imho


I keep coming back to Skyrim, too. I, of course, use a patched version of the game to avoid Steam update shenanigans, though I patched my Steam copy myself using two open-source software solutions.

Every now and then, I'll wipe out all of my mods and start over.



GreiverBlade said:


> "mod it until it crashes" is my moto ... top ranked or brand news ... no matter to me, well i have way less CTD since i swapped GPU tho  (literally 1 CTD in 670hrs will now )
> as for update, i use a downpatcher to 1.6.353.0 since my current modlist is not compatible with 1.6.640.0  but Bethesda should learn to not touch what is not broken ... SE issue started with AE and were induced by the AE update, AE was an insult and totally uncalled for ... they could just have made a few free DLC and keep the previous compiler (i will not buy the AE pack nonetheless, the free update is already enough of a mess  )



I really hope that the modding community starts supporting the GOG version since that is already safe from forced updates. While I trust the patching software I use, it really shouldn't be necessary.


----------



## Kabouter Plop (Oct 31, 2022)

If anyone still wonders why i have blackscreens and you do not:
MPO is only enabled on the main screen, so if you start a whatsapp video call with the chat window on the 2nd screen it won't trigger. i just tested this on a clean install of 22.7.1, and i first verified that i do blackscreen when doing it on the main screen.
MPO also does not work on the main screen if it's set to 10 bit or 12 bit; it only supports 8 bit color depth.

MPO is enabled if this key is missing and if the main screen is set to 8 bit color depth, and it can only trigger blackscreens on the main screen.

Adding the missing key disables MPO:


```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005
```
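To make the conditions explicit, here's a tiny sketch; this just encodes the behaviour i'm reporting in this post, nothing official:

```python
# Tiny sketch of the trigger conditions reported above: MPO is active
# only when the OverlayTestMode key is missing, on the main screen, and
# at 8 bit color depth. This encodes the behaviour described in this
# post, not any documented Windows rule.

def mpo_active(key_present, is_main_screen, bit_depth):
    return (not key_present) and is_main_screen and bit_depth == 8

print(mpo_active(False, True, 8))    # True: the susceptible setup
print(mpo_active(False, True, 10))   # False: 10 bit main screen
print(mpo_active(False, False, 8))   # False: secondary screen
print(mpo_active(True, True, 8))     # False: key added, MPO disabled
```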

If anyone wants to experiment with trying to reproduce blackscreens:
Make sure that key is missing, install windows 11 22h2, and enable flip model optimisations; this speeds up blackscreens quite a lot if you install a driver like 22.7.1, which is the most reliable driver at recovering from blackscreens while still being able to trigger them under 2 minutes. Other drivers that can also trigger it under 2 minutes usually do not recover and sometimes even generate a dump file.
Anyway, keep in mind MPO is only enabled on the main screen and only if 8 bit color depth is selected.

The Whatsapp desktop app from the Microsoft store can trigger it under 2 minutes doing a video call, quite reliably, especially if firefox is also running; probably also with chrome running.









Get WhatsApp from the Microsoft Store
www.microsoft.com


----------



## GreiverBlade (Oct 31, 2022)

Kabouter Plop said:


> If anyone still wonders why i have blackscreens and you do not:
> MPO is only enabled on the main screen, so if you start a whatsapp video call with the chat window on the 2nd screen it won't trigger. i just tested this on a clean install of 22.7.1, and i first verified that i do blackscreen when doing it on the main screen.
> MPO also does not work on the main screen if it's set to 10 bit or 12 bit; it only supports 8 bit color depth.
> 
> ...


ah, well now i can guess why i do not have issues ... as i have all you mention except Win 11 22H2 ... 

then isn't it a Win 11 issue? not to mention 22H2 is also riddled with people reporting issues about anything and everything


----------



## AusWolf (Oct 31, 2022)

Kabouter Plop said:


> If anyone still wonders why i have blackscreens and you do not:
> MPO is only enabled on the main screen, so if you start a whatsapp video call with the chat window on the 2nd screen it won't trigger. i just tested this on a clean install of 22.7.1, and i first verified that i do blackscreen when doing it on the main screen.
> MPO also does not work on the main screen if it's set to 10 bit or 12 bit; it only supports 8 bit color depth.
> 
> ...


Please excuse my lack of knowledge, but what is MPO? Also, is it a Win 11 issue, or does it happen on Win 10 too?


----------



## GreiverBlade (Oct 31, 2022)

AusWolf said:


> Please excuse my lack of knowledge, but what is MPO? Also, is it a Win 11 issue, or does it happen on Win 10 too?


MPO: Multi-Plane Overlay.
Nvidia suffered an issue with it, it seems.








[Feature Request] Disable Multiplane Overlay (MPO)
www.techpowerup.com




dunno if it's Win 11 exclusive, but as i mentioned, i have everything set up to make it happen and it does not happen, and i run Win 10 21H2
i did roll back to 22.7.1 to test it and then went back to 22.10.3

it frustrates me a bit, but i can't replicate it (and i have no intention to upgrade to Win 11 soon )

i guess i should just be glad my rig runs as it should without "Murphy" meddling with it


----------



## Kabouter Plop (Oct 31, 2022)

GreiverBlade said:


> MPO : Multi-Plane Overlay
> Nvidia suffered an issue with it, it seems
> 
> 
> ...



Windows 10 as well





After updating to NVIDIA Game Ready Driver 461.09 or newer, some desktop apps may flicker or stutter when resizing the window on some PC configurations | NVIDIA
nvidia.custhelp.com
				



It actually causes a lot more issues than Nvidia probably realised, corruption as well, and i am starting to suspect that it also caused shader cache corruption for Nvidia, and quite possibly for AMD too, altho i have not noticed similar bugs with MPO enabled yet. the reason i suspect it is that the alt-tab corruption issue is also MPO related, and if things can corrupt on screen, then perhaps the shader cache can also be generated with the same corruption.


----------



## Hekusaurus (Oct 31, 2022)

I bit the bullet and decided to try win 11 again, but the problems are back all over again. Anyone found a fix for this? It only happens if I run anything newer than windows 10 version 1903, and only on this AMD 6800 reference card. I've gotten a new PSU, memory, motherboard, new nvme drives, hard disks, case. Nothing has worked.

It causes a small audio stutter every now and then, which is especially annoying when listening to music.


----------



## Kabouter Plop (Oct 31, 2022)

Hekusaurus said:


> I bit the bullet and decided to try win 11 again, but the problems are back all over again. Anyone found a fix for this? It only happens if I run anything newer than windows 10 version 1903, and only on this AMD 6800 reference card. I've gotten a new PSU, memory, motherboard, new nvme drives, hard disks, case. Nothing has worked.
> 
> It causes a small audio stutter every now and then, which is especially annoying when listening to music.



What windows build do you see if you do win+r and type winver?


----------



## Hekusaurus (Oct 31, 2022)

22H2 now (OS build 22621.674)
It happens on all versions after windows 10 ver 1903


----------



## Kabouter Plop (Oct 31, 2022)

there is an optional update, i believe 22621.755 





October 25, 2022—KB5018496 (OS Build 22621.755) Preview - Microsoft Support
support.microsoft.com
				




Maybe give that a try.
Also run DDU from safe mode; then, when installing the driver, also select factory reset, cos windows now reinstalls the AMD gpu driver right after.

Give 22.10.3 a try and apply the reg key for disabling MPO to see if that latest driver has issues as well.


----------



## Hekusaurus (Oct 31, 2022)

Kabouter Plop said:


> there is an optional update i believe 22621.755
> 
> 
> 
> ...



Tried to disable MPO, but it didn't help. Testing KB5018496, but am doubtful that it'll work.


----------



## Kabouter Plop (Oct 31, 2022)

Something is causing you issues at least.
this is my latency mon.
It's also possible you are using an old version of latency mon which is bugged and misreporting.



Since i am running 7.20.
I remember from windows 7 days the old dpc latency tool stopped working, i believe since windows 8, and it had similar misreporting on windows 8 and 10; heck, it looks like this.
The smaller window is the tool that misreports.


----------



## Hekusaurus (Oct 31, 2022)

Kabouter Plop said:


> Something is causing you issues at least.
> this is my latency mon.
> It's also possible you are using an old version of latency mon which is bugged and misreporting.
> View attachment 267964
> ...


Stutter happens even when I am not using latencymon. I DDU'd drivers and installed 'em back in safe mode. Waiting for results now.


----------



## Kabouter Plop (Oct 31, 2022)

Hekusaurus said:


> Stutter happens even when I am not using latencymon. I DDU'd drivers and installed 'em back in safe mode. Waiting for results now.



Alright, do update latency mon to 7.20 tho when checking next time; but check with 7.0 first, then update it to 7.20





Resplendence Software - LatencyMon: suitability checker for real-time audio and other tasks
www.resplendence.com


----------



## Hekusaurus (Oct 31, 2022)

Didn't work. dxgkrnl.sys is hitting 28k again.
wdf01000.sys is hitting 18k as well


----------



## Kabouter Plop (Oct 31, 2022)

what does it report when you run powershell as admin and run the command sfc /scannow?


----------



## Hekusaurus (Oct 31, 2022)

Kabouter Plop said:


> what does it report when you run powershell as admin and run the command sfc /scannow ?


"Windows resource protection did not find any integrity violations."
Some people having the same issue


----------



## 3x0 (Oct 31, 2022)

Hekusaurus said:


> wdf01000.sys hitting 18k aswell


Looking at your previous screenshot, wdf01000.sys is only 12, not 12k. You're looking at the wrong number; the interrupt-to-process latency is the one in the thousands range, not wdf01000.sys


----------



## Hekusaurus (Oct 31, 2022)

3x0 said:


> Looking at your previous screenshot, wdf01000.sys is only 12 not 12k. You're looking at the wrong number, Interrupt to process latency is in the thousands range not wdf01000.sys


Good point. The problem persists, however, when, for example, playing WoW on the main screen and playing any video on the second screen. One problem unrelated to dxgkrnl.sys might be because of the nvme position on the motherboard. Might try to change it tomorrow or change something in the bios.


----------



## 3x0 (Oct 31, 2022)

Can you try this suggestion first?


oobymach said:


> Run MSI Util v3 (see below) as admin and check any/all of the boxes that have MSI listed under supported modes, then hit apply in the top right, shut down, and restart; that will help with latency
> 
> 
> 
> ...


----------



## Hekusaurus (Oct 31, 2022)

3x0 said:


> Can you try this suggestion first?


Checked it already. All were on and I've set GPU priority to High


----------



## Kabouter Plop (Nov 3, 2022)

Im surprised that even after the discovery that setting 10 bit color depth or disabling MPO helps, there are people that seem to believe it's a hardware issue. i seriously hate these types of people. i get that people thought like this before the discovery, but after, when there is even more proof that it is a bug?


----------



## AusWolf (Nov 6, 2022)

Hi people. 

I'm joining the club (again) with a reference 6750 XT. My first impressions: 1. It's freakin' beautiful. Probably the best looking GPU I've ever owned. 2. It's freakin' hot. GPU temp around 80-85 °C, hot spot around 100-105 °C. Lowering the power target to -6%, or ramping up the case fans for more airflow has no effect. Is this normal?


----------



## 3x0 (Nov 6, 2022)

It's a bit hotter than the reference 6700 XT from TPU reviews: https://www.techpowerup.com/review/amd-radeon-rx-6700-xt/33.html
I'd say it's on the hotter side, but 115°C should be the throttling point, and 105°C is within spec. You might want to try undervolting instead of ramping the fans up.


----------



## AlwaysHope (Nov 6, 2022)

AusWolf said:


> Hi people.
> 
> I'm joining the club (again) with a reference 6750 XT. My first impressions: 1. It's freakin' beautiful. Probably the best looking GPU I've ever owned. 2. It's freakin' hot. GPU temp around 80-85 °C, hot spot around 100-105 °C. Lowering the power target to -6%, or ramping up the case fans for more airflow has no effect. Is this normal?


Measuring those temps in Adrenalin software suite? What are the ambient temps in the room?


----------



## AusWolf (Nov 6, 2022)

AlwaysHope said:


> Measuring those temps in Adrenalin software suite? What are the ambient temps in the room?


It's in GPU-Z. The room is around 22 °C if my thermostat is correct.



3x0 said:


> It's a bit hotter than the 6700XT reference from TPU reviews https://www.techpowerup.com/review/amd-radeon-rx-6700-xt/33.html
> I'd say it's on the hotter side but 115°C should be the throttling point and 105°C is within spec. Might want to try undervolting instead of ramping fans up.


I might do that. Do you have any experience with the auto undervolt option in Adrenalin? Is it any good?


----------



## 3x0 (Nov 6, 2022)

I usually avoid anything automatic relating to OCing. The way I'd do it is manually decrease the voltage in 50mV steps and test for at least 1h each time; increase by 25mV when unstable. You could even try increasing the frequency slightly while undervolting; you might eke out more performance while you're at it.
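If it helps to see the procedure spelled out, here's a rough sketch in Python; `is_stable` stands in for your hour-long stress test per step, and the 50mV/25mV steps are just the numbers above:

```python
# Sketch of the manual undervolting routine above: drop the core voltage
# in 50mV steps while the card stays stable, and when a step fails, back
# off to 25mV above the failing value. is_stable() stands in for a real
# ~1h stress test per step; the step sizes are just rules of thumb.

def find_undervolt(start_mv, is_stable, floor_mv=800):
    v = start_mv                          # last known-stable voltage
    while v - 50 >= floor_mv and is_stable(v - 50):
        v -= 50                           # the 50mV step held: accept it
    # the next 50mV step failed (or hit the floor); try adding 25mV
    # back relative to that failed step, per the advice above
    candidate = v - 25
    if candidate >= floor_mv and is_stable(candidate):
        return candidate
    return v

# Toy model: pretend the card is stable down to 1020mV
result = find_undervolt(1150, lambda mv: mv >= 1020)
print(result)  # prints 1025
```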


----------



## AlwaysHope (Nov 6, 2022)

AusWolf said:


> It's in GPU-Z. The room is around 22 °C if my thermostat is correct.
> ...


Ok, but do those temps correspond to what HWiNFO indicates? I should have mentioned that earlier in my post. If this is the situation, then I'd be looking at improving airflow inside your case as a first step in troubleshooting those temps.


----------



## AusWolf (Nov 6, 2022)

AlwaysHope said:


> Ok, but do those temps correspond to what HWiNFO indicates? I should have indicated that earlier in my post. IF this is the situation, then I'd be looking at improving air flow inside your case as a first step in troubleshooting those temps.


Do you suspect that GPU-Z reports false temps? I'll check with HWiNFO after work.

My case is a Corsair 280X. It's not very good in the airflow department. It's already full of be quiet! Silent Wings 3 14 cm fans, so the only thing I can do now is take some panels off.


----------



## Valantar (Nov 7, 2022)

AusWolf said:


> Hi people.
> 
> I'm joining the club (again) with a reference 6750 XT. My first impressions: 1. It's freakin' beautiful. Probably the best looking GPU I've ever owned. 2. It's freakin' hot. GPU temp around 80-85 °C, hot spot around 100-105 °C. Lowering the power target to -6%, or ramping up the case fans for more airflow has no effect. Is this normal?


That sounds a bit high to me. Doesn't the 6750 XT use the 6800 cooler, the two-slot, three-fan one (rather than the two-fan one for the 6700 XT)? I would definitely have expected lower temps than that, seeing how that cooler manages the 6800 very nicely. Their TDPs are the same, and the 6750 XT is a smaller die, so I'd expect it to run a bit hotter, but that's definitely more than I'd expect.

Also: welcome to the dual-wield RDNA2 club!


----------



## AusWolf (Nov 7, 2022)

Valantar said:


> That sounds a bit high to me. Doesn't the 6750 XT use the 6800 cooler, the two-slot, three-fan one (rather than the two-fan one for the 6700 XT)? I would definitely have expected lower temps than that, seeing how that cooler manages the 6800 very nicely. Their TDPs are the same, and the 6750 XT is a smaller die, so I'd expect it to run a bit hotter, but that's definitely more than I'd expect.
> 
> Also: welcome to the dual-wield RDNA2 club!


Thanks.  

That's what I thought too... that it's okay, but a bit hotter than what I'd expect. I know that my case's strongest side isn't airflow, but I don't know...


----------



## Valantar (Nov 7, 2022)

AusWolf said:


> Thanks.
> 
> That's what I thought too... that it's okay but a bit hotter than what I'd expect. I know that my case's greatest side isn't airflow, but I don't know...


You could always try removing the front panel and/or opening the side panel to see how the temperatures change? That way you can see if it's the case, or if that GPU just has a too-conservative fan curve or something.


----------



## AusWolf (Nov 8, 2022)

Valantar said:


> You could always try removing the front panel and/or opening the side panel to see how the temperatures change? That way you can see if it's the case or if that gpu just has a too conservative fan curve or something.


I've done a few things, here's the update. So far, I've:

*Opened the side of the case* - the fans ran a couple of percent slower, but no effect on thermals.
*Lowered the power target to its lowest, -6%* - a 12 W reduction in power, but no effect on thermals.
*Tried undervolting* - the card is stable with the 1150 mV option, which, strangely, is more like 1 V under normal use. That lowered its power consumption by 15 W, but also made the fans spin faster for some reason. That resulted in cooler operation, but not by much: about -5 °C max.
*Tried a more aggressive fan curve* - with fans at 100%, the GPU runs at 76-77 °C with a hotspot of 94-95 °C (a 5 °C reduction in GPU temp and a 10 °C reduction in hotspot temp). Not bad, but I'm scared that the neighbours will call the police on me for the noise, not to mention that gaming like this is impossible.

With all this in mind, can we establish that reference design AMD cards run hot and there's nothing to do about it?
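For what it's worth, the "more aggressive fan curve" experiment above is just a piecewise-linear map from temperature to fan duty. A minimal sketch of the idea (the breakpoints here are made-up illustration values, not AMD's or Wattman's defaults):

```python
def fan_percent(temp_c, curve=((40, 30), (60, 45), (80, 70), (90, 100))):
    """Linearly interpolate fan duty (%) from a list of (temp °C, duty %) points.

    The curve points are hypothetical examples; a 'more aggressive' curve
    just shifts these breakpoints toward lower temperatures / higher duty.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: minimum duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # past the last point: pin at max duty
```

With these example points, 70 °C lands halfway between the (60, 45) and (80, 70) breakpoints, i.e. 57.5% duty.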


----------



## GreiverBlade (Nov 8, 2022)

this is why i only get reference cards if i plan to use either an aftermarket cooler (like a Prolimatech MK-26) or watercooling (like i did with my R9 290 ref. )

custom designs are fine, references are, well ... hotter indeed (too bad TPU does not have a 6750 ref. review )



i love the Red Devil, i will never say it enough ... (and i love Opera GX's Snapshot function  ) although it runs a smidge hotter than that 6750 XT above 

hilarious how the quiet BIOS is only 0.1 dBA less but 1° more on GPU and hotspot ... not really worth it, she's already almost dead silent under load using BIOS 1


----------



## AusWolf (Nov 8, 2022)

GreiverBlade said:


> this is why i only get references if i plan to use either an aftermarket cooler (like a Prolimatech MK-26) or watercooling (like i did with my R9 290 ref. )
> 
> custom design are fine, references are well ... hotter indeed (too bad TPU does not have a 6750 ref. review )
> ...


Lesson learned...  now only two questions remain:
1. unbearable 100% fan and 95 °C hotspot, or standard curve and 102-105 °C hotspot?
2. AMD says hotspot up to 110 °C is fine, but will heat damage the card over time?


----------



## GreiverBlade (Nov 8, 2022)

AusWolf said:


> Lesson learned...  now only two questions remain:
> 1. unbearable 100% fan and 95 °C hotspot, or standard curve and 102-105 °C hotspot?
> 2. AMD says hotspot up to 110 °C is fine, but will heat damage the card over time?


no option to return it and take a custom design for a bit more?

i did 100% fan all the time for a long time with an Asus ROG 8800 Ultra (until i got the MK-26) and with the R9 290, but at that time i did game with headphones on ... impossible otherwise 
edit: oh, and i have no neighbours when it's not a weekend or school holidays ... (and they are bloody freaking noisy when they're here ... so i doubt they would notice the leaf blower running indoors  )

for Nr. 2 well if they rate it up to 110°C then 105°C is indeed fine, i doubt they would do a ref. design running that hot if it drastically reduced the product life.


----------



## AusWolf (Nov 8, 2022)

GreiverBlade said:


> no option to return it and take a custom design for a bit more?
> 
> i did 100% fan all the time for a long time with a Asus ROG 8800 Ultra (until i got a the MK-26) and with the R9 290, but at that time i did game with headphones on ... impossible otherwise


It cost me about £100 less than what custom designs are selling for - that's why I chose this one. Besides, it's f-ing beautiful (which I honestly didn't consider until I opened the box). 

If AMD is right, and 105 °C on the hotspot is nothing to worry about, then I'm fine... I'm just not sure if they meant that long-term.

I can understand why you swapped a blower on the 8800 Ultra... I just didn't expect this much heat with a three-fan design, even if it's "only" dual slot. 

Edit: The real question here is whether 105 °C is a problem in the card, or a problem in my head.


----------



## GreiverBlade (Nov 8, 2022)

AusWolf said:


> It cost me about £100 less than what custom designs are selling for


ouch, for once i feel blessed i got a custom design for lower than a ref price ... (which is surprising in Switzerland, thank you cryptocrash! )



AusWolf said:


> Besides, it's f-ing beautiful (which I honestly didn't consider until I opened the box).


indeed it is ... i even kept the ref cooler from the R9 290 when i sold the card with the Aquatuning block on it  (not as beautiful but still a nice collection item for me )



AusWolf said:


> Edit: The real question here is whether 105 °C is a problem in the card, or a problem in my head.


well, heat damages electronics indeed, but if a component is rated "up to" a certain temp, then as long as it stays below that limit, it should be fine. caps are a good example: in one PSU i had electrolytic caps rated 90°C bursting, while the polymer caps rated 105°C were fine running constantly just 5°C below their max. i re-capped all the PSU's electrolytic caps with equivalent polymer caps, but that one was a bad design ... why use lower-end caps rated 10°C below the load operating temps? and i could use that PSU for 5 more years after the warranty expired (which was 3 yrs)



AusWolf said:


> If AMD is right, and 105 °C on the hotspot is nothing to worry about, then I'm fine... I'm just not sure if they meant that long-term.


well ... long term means running 100% load all the time, but you don't run it at 100% all the time, right? 



AusWolf said:


> I can understand why you swapped a blower on the 8800 Ultra... I just didn't expect this much heat with a three-fan design, even if it's "only" dual slot.


yep ... especially since that blower was a reference design with a ROG sticker slapped on it  (at the time all AIB boards were just references with "cosmetics" on them ... just like my 9800 GX2 from Zotac, which was also a reference copycat with branding ... although i lost the shroud on that one hehe  )


----------



## Valantar (Nov 8, 2022)

AusWolf said:


> With all this in mind, can we establish that reference design AMD cards run hot and there's nothing to do about it?


That sounds like it's the case for this one, but not in general - the RX 6800 reference runs pretty cool for a reference card, with the same(ish?) cooler and TDP. Seems to me that the RX 6750 XT is just a hotter-running die overall - it's definitely pushed quite far in terms of clocks and power for such a small die.

I would definitely go for hotter and quieter, but then I'm generally very sensitive to noise. 100% fans on any GPU would be unbearable to me. But I also wouldn't worry about 105°C hotspot temps. AMD guarantees 110°C (and yes, that's long-term), and silicon can handle temperatures quite a bit above that before taking damage at sensible voltages and current levels. There's safety margin built into that 110°C number too after all. And, of course, the hotspot will most likely move around the die - it's not the same area hitting those temperatures all the time.

It might be worth looking into repasting and increasing mounting pressure on the cooler though, as that's been a repeated complaint about some AMD reference designs - either using some sort of quasi-thermal pad instead of paste, having too low mounting pressure (due to the thickness of these pads/sheets), or both. Nylon washers behind the retention screws would do the trick.


----------



## BatRastard (Nov 8, 2022)

Ajp2022 said:


> Hi,
> 
> Thanks for your response. I don't have dp on my monitor so am using hdmi. I do have a hdmi to dp adaptor but that is not in use. I've changed the input to HDMI 1 and turned off the auto source. Sadly the issue is persisting. I didn't have this with the RX580 and when l switch out to that GPU it goes away.
> 
> It is as if the RX 6600 only kicks in once the OS has reached the login screen.



I just uploaded a video of this very behavior and it explains why this is happening. 









In short: Either AMD or their AIBs hardcoded the GOP driver embedded in the VBIOS to display a specific resolution and refresh rate at POST. If your display can support it, all is great! If not, you're getting "Input Not Supported" until Windows/Linux loads and the Adrenalin drivers kick in. For example, my MSI RX 6600 Mech 2 does this very thing to my poor 1080p Acer display (R241Y) at boot-up of my MSI B550 A-Pro. I used to see a little underline cursor in the upper left corner with my GTX 1050 Ti. Not anymore. What I had to do was hook my Vizio V505-G9 4KTV up to the HDMI, and that got me into the BIOS so I could enable everything for Resizable BAR (which requires UEFI Mode), and that's when I noticed a small detail on the Info screen of my 4KTV: the HDMI output of this MSI RX 6600 is hardcoded to drive a 2160p signal at 30 Hz. Not 720p, not 1080p, not even basics like 800x600. Nope - 4K out of the gate! No way in hell my Acer is displaying that! 

So the only solutions (other than AMD or their AIBs releasing a new VBIOS) are to stay in CSM Mode, buy a 4K display with HDMI/DisplayPort, or try to find an HDMI-to-HDMI (or DP-to-HDMI) converter to downscale the 2160p to 1080p ... 

That or hooking up the 4KTV every time I need to get into the BIOS ... (I'm not overclocking NOW!  )


----------



## GreiverBlade (Nov 8, 2022)

BatRastard said:


> I just uploaded a video of this very behavior and it explains why this is happening.
> 
> 
> 
> ...


i use a 2560x1440 screen @2880x1620 (technically)

is that issue only with HDMI? because if yes ... another solution is to use DP; my screen is connected via DP and i have zero issues of the sort ... 
nevermind, i reread it - the one you answered mentions his screen has no DP input ... indeed, the only solutions in that case are the ones you mentioned


----------



## AusWolf (Nov 8, 2022)

Valantar said:


> That sounds like it's the case for this, but not in general - the RX 6800 reference runs pretty cool for a reference card. With the same (ish?) cooler and TDP. Seems to me that the RX 6750XT is just a hotter running die overall - it's definitely pushed quite far in terms of clocks and power for such a small die.


I think the 6800 XT runs cooler because it pushes about the same power through a larger die - that is, the 6750 XT's heat is more concentrated and harder to get rid of. It's nearly 6800 level of performance asked from a smaller die. It's a shame that there isn't much undervolting headroom in it (or in my unit at least).



Valantar said:


> I would definitely go for hotter and quieter, but then I'm generally very sensitive to noise. 100% fans on any GPU would be unbearable to me. But I also wouldn't worry about 105°C hotspot temps. AMD guarantees 110°C (and yes, that's long-term), and silicon can handle temperatures quite a bit above that before taking damage at sensible voltages and current levels. There's safety margin built into that 110°C number too after all. And, of course, the hotspot will most likely move around the die - it's not the same area hitting those temperatures all the time.


Thanks, that's reassuring.  I think I'll go that route myself (I hate fan noise too). If AMD purposefully configured this card to run like this, then it must be safe and the problem is really just in my head.



Valantar said:


> It might be worth looking into repasting and increasing mounting pressure on the cooler though, as that's been a repeated complaint about some AMD reference designs - either using some sort of quasi-thermal pad instead of paste, having too low mounting pressure (due to the thickness of these pads/sheets), or both. Nylon washers behind the retention screws would do the trick.


Repasting isn't really an option on these as far as I know. I've watched GN's teardown of the 6700 XT - AMD uses some thin graphene sheet (or something like that) as TIM on the reference GPUs, which theoretically works better than paste.

I also remember doing the washer trick on my ROG Strix 5700 XT back in the day. It slightly improved GPU temps, but made the VRAM hotter, as the fans could spin slower and the VRAM didn't get as much airflow. I remember seeing above 100 °C on the VRAM hotspot, which is quite near the safe limit of GDDR6 (yes, the ROG Strix 5700 XT was a badly constructed card). At least I haven't seen the VRAM on the 6750 XT exceed 90 °C yet (touch wood).


----------



## Valantar (Nov 9, 2022)

BatRastard said:


> I just uploaded a video of this very behavior and it explains why this is happening.
> 
> 
> 
> ...


Wow, that's weird. I've used my PowerColor RX 6600 Fighter plenty on a 1200p monitor (an old Dell, through a DP to DVI adapter), but don't think I've tried it on HDMI yet. It's about to move in next to a 1080p TV though, so I can report back after seeing how it responds to booting up to 1080p over HDMI.


----------



## Lost_Troll (Nov 9, 2022)

BatRastard said:


> I just uploaded a video of this very behavior and it explains why this is happening.
> 
> 
> 
> ...



Read this thread: Help! - New Radeon RX 6600 XT problem (PC not booting, freeze before bios) | TechPowerUp Forums

I think your issue is with the UEFI BIOS having a bad setting. It could also be that the Acer display has a bad EDID PROM, or one not written properly to meet the EIA/CEA (CTA) standard. 

Extended Display Identification Data - Wikipedia

On a side note, I used an HP V21 1080p display to set up my new build with an RX 6650 XT without issue.
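If a bad EDID is the suspect, it's easy to check what the display is actually advertising. A minimal sketch of pulling the preferred/native mode out of the first 18-byte detailed timing descriptor, with offsets per the public VESA EDID layout (on Linux the raw blob would typically come from `/sys/class/drm/*/edid` - an assumption about the setup):

```python
def edid_preferred_mode(edid: bytes):
    """Return (h_active, v_active, pixel_clock_mhz) from an EDID base block.

    Parses the first detailed timing descriptor (bytes 54-71), which by
    convention holds the display's preferred/native mode.
    """
    d = edid[54:72]
    pixel_clock_mhz = (d[0] | d[1] << 8) / 100  # stored in units of 10 kHz
    h_active = d[2] | ((d[4] >> 4) << 8)        # low 8 bits + upper 4 bits
    v_active = d[5] | ((d[7] >> 4) << 8)
    return h_active, v_active, pixel_clock_mhz
```

A healthy 1080p75 panel should report 1920x1080 here; if it reports something else, the EDID really is suspect.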


----------



## PerfectWave (Nov 9, 2022)

BatRastard said:


> I just uploaded a video of this very behavior and it explains why this is happening.
> 
> 
> 
> ...


you can enter the BIOS from Windows via the recovery screen. try it and see if it appears on your monitor


----------



## BatRastard (Nov 9, 2022)

Lost_Troll said:


> Read this thread: Help! - New Radeon RX 6600 XT problem (PC not booting, freeze before bios) | TechPowerUp Forums
> 
> I think your issue is with the UEFI BIOS having a bad setting. It could also be the Acer display has a bad EDID prom or is not written properly to meet the EIA/CEA (CTA) standard.
> 
> ...


The OP in that thread neglected to mention one crucial detail: was he in UEFI Mode or Legacy CSM? I believe his rig was in UEFI Mode before he cleared CMOS, which put him back in Legacy CSM Mode, and he errantly believed his problem was solved. We're left to speculate whether his issue returned upon re-enabling UEFI Mode ... 

The behavior of a GPU at POST is determined by the GOP driver embedded in the GPU and by UEFI/CSM Mode. The GOP driver can be written to initialize 480p over HDMI in CSM and then 1080p or 2160p over DP in UEFI Mode. It's all up to the AIBs - they modify AMD's firmware to their standards, which is why some of you with the same GPU from another vendor have different experiences. There's a new menu that appeared in the B550 A-Pro BIOS after installing the RX 6600 called "AMD Driver Health". I clicked it and it's empty. I'm going to clear CMOS just in case the UEFI VBIOS of the GTX 1050 left a resolution table for the motherboard UEFI to use and it's preventing the RX 6600 from leaving its own. A CMOS clear and reconfiguration couldn't hurt at this point.



PerfectWave said:


> you can enter bios from windows from the recovery window. try it and see if it appear on your monitor


It didn't.

Okay, it's getting really really weird now ... 

The CMOS clear didn't help and neither did CSM Mode. In fact, tonight's evidence suggests an inexplicable incompatibility between the Acer display and this RX 6600. And I mean inexplicable .... 

First some screenshots ... 


 


From these screenshots, you can pretty much glean that the native resolution of the Acer display is 1080p at 75 Hz, and that's what it's using within Windows on both the RX 6600 and the GTX 1050 Ti. So on a whim, I rebooted and blind-launched into my BIOS. From there, I unplugged the Acer and plugged the 4KTV back in, and then pulled up the detailed info on the Vizio TV with its own remote ... and it, too, was displaying my BIOS at 1080p at 75 Hz instead of at 2160p at 30 Hz. 

At this point, I did the CMOS clear the easy way by pressing F6 --> Enter --> F10 --> Enter ... 

Once the system rebooted and I got back into the BIOS, I hooked the Acer back up and hit CTRL+ALT+DEL to reboot ... and "Input Not Supported" ... 

Let me restate that: the RX 6600 is properly detecting and trying to feed this Acer display its own native resolution and refresh rate ... *and it only accepts it in Windows! *

This ridiculous behavior NEVER happened on the GTX 1050 Ti OC ... 

None of this makes any sense ... I'm at a total loss here!


----------



## Valantar (Nov 9, 2022)

Wow, that is some baffling stuff. Can you somehow lock the display to 60Hz through its OSD?


----------



## Lost_Troll (Nov 9, 2022)

You may want to see if there is an updated version of your card's VBIOS: VBIOS Information | MSI Global English Forum - Index

"If you are running stock VBIOS that came with your card, your best bet for update check is by running LiveUpdate / MSI Dragon Center as that will scan the card and provide an update if one is available."

You may also want to open a support ticket with MSI and see what they say.


----------



## BatRastard (Nov 9, 2022)

The OSD only has a few display-related options such as overdrive and disabling FreeSync, and some are greyed out. Also, Dragon Center is seriously deprecated and was replaced with MSI Center. It causes more problems than it's worth, but I did reinstall it just to see if there was a video BIOS update. No such luck ...

*UPDATE: *Got a spare HP w2006 (sitting on top of my Linux box in the video). Hooked a DVI-to-HDMI converter to it and the RX 6600 said, "How you doin'?!?"  
Hooked the Acer back up, and it displayed my BIOS screen at the HP's native resolution (1680x1050) until I rebooted. Back to "Input Not Supported". Makes me sick ...


----------



## BatRastard (Nov 12, 2022)

After MSI Support more or less told me to get bent, I broke down and replaced the Acer with ....... an LG UltraGear 32" 1440p Gaming monitor for $212 out the door ...


----------



## GreiverBlade (Nov 12, 2022)

BatRastard said:


> After MSI Support more or less told me to get bent, I broke down and replaced the Acer with ....... an LG UltraGear 32" 1440p Gaming monitor for $212 out the door ...


damn, that LG UltraGear 32" is 252 CHF for me - a whole 47 CHF cheaper than my budget Medion Erazer 32" 1440p IPS (a budget monitor that is not a TN or VA ... an achievement by itself ... )

does it support resolutions higher than 1440p? (mine does 2880x1620 aka 3k )


----------



## BatRastard (Nov 12, 2022)

GreiverBlade said:


> damn, that LG UltraGear 32" is 252 CHF for me - a whole 47 CHF cheaper than my budget Medion Erazer 32" 1440p IPS
> 
> does it support higher resolution than 1440p (mine does 2880x1620p aka 3k )


Nope just 1440p ... and HDR is pathetic compared to my Vizio. Other than that, not bad. Haven't gamed on it yet. I'm due for a fresh Windows 11 install, actually.


----------



## GreiverBlade (Nov 12, 2022)

BatRastard said:


> Nope just 1440p ... and HDR is pathetic compared to my Vizio. Other than that, not bad. Haven't gamed on it yet. I'm due for a fresh Windows 11 install, actually.


i am still surprised by my monitor, which i've had for quite some years now ... still the perfect 1440p60/75 monitor i could find for ~300 CHF (normal listing 399; i got a "last piece in stock" discount) 
paired with the 6700 XT i have, it's just perfect ... 

just slightly bummed that 75 Hz is 1440p-only, but 1620p60 is more than enough given what i play (plus i can't see any improvement/advantage above 100 Hz, tested )


----------



## BatRastard (Nov 12, 2022)

The refresh rate on this is killer -- 60 Hz, 120 Hz, 144 Hz, and 165 Hz. I did a little gaming the other day on the Acer. Got damned good performance out of Control, Immortals Fenyx Rising, and Doom. But I'm concerned about the amount of heat in my FD Focus G with only 2 front intake fans and 1 exhaust fan out the back. Ran Time Spy yesterday and my 3600 hit 93°C ... it only hits 71°C in Cinebench ...


----------



## Valantar (Nov 12, 2022)

GreiverBlade said:


> damn that LG ultragear 32" is 252chf for me that's a whole 47chf cheaper than my budget Medion Erazer 32" 1440p IPS (budget that is not a TN or VA ... is an achievement by itself ... )
> 
> does it support higher resolution than 1440p (mine does 2880x1620p aka 3k )


What is the native resolution of that panel? I've never heard of a 2880x1620 desktop monitor - so unless it's downscaling a higher input resolution to 1440p, that would be a rather unique product.



BatRastard said:


> Nope just 1440p ... and HDR is pathetic compared to my Vizio. Other than that, not bad. Haven't gamed on it yet. I'm due for a fresh Windows 11 install, actually.


Speaking of good HDR, I just bought an AOC Agon Pro AG274QXM - not _entirely_ sure that I'll keep it yet, but man, that MiniLED backlight makes HDR a joy, and 1440p170 is nothing to scoff at either. Finally my 6900 XT has room to stretch its legs some


----------



## GreiverBlade (Nov 12, 2022)

Valantar said:


> What is the native resolution of that panel? I've never heard of a 2880x1620 desktop monitor - so unless it's downscaling a higher input resolution to 1440p, that would be a rather unique product.


native is 1440p according to the monitor specs (although i suspect it's a native 1620p panel downscaled )
as per the manual: max 1440p


i mentioned the over-resolution once here:

https://www.techpowerup.com/forums/threads/general-nonsense.232862/post-4604868
https://www.techpowerup.com/forums/threads/general-nonsense.232862/post-4604879

when 2.5k is too little and 4k is too much, 3k FTW


----------



## Valantar (Nov 12, 2022)

GreiverBlade said:


> native is 1440p according to monitor specs (although i suspect it's a 1620p native downscaled )
> as per manual : max 1440p
> 
> ...


If it's native 1440p, I'm surprised that downscaled 1620p isn't annoyingly blurry, tbh. That's a weird scaling factor, and while ClearType might be able to handle text rendering well, other sharp edges must still look soft?


----------



## GreiverBlade (Nov 12, 2022)

Valantar said:


> If it's native 1440p I'm surprised that downscaled 1620p isn't annoyingly blurry tbh. That's a weird scaling factor, and while Cleartype might be able to handle text rendering well, other sharp edges must look soft still?


it looks fine at 1440p and crisper at 1620p; not sure what to make of that screen, except that it's the best screen  i ever owned


----------



## Valantar (Nov 12, 2022)

GreiverBlade said:


> it look fine at 1440p and crispier at 1620p, not sure of what to make about that screen, except that, it's the best screen  i ever owned


That's very weird - supersampling can make things look a bit sharper, but typically doesn't work well for static images, text or desktop apps. I guess it has a good scaler then!


----------



## GreiverBlade (Nov 12, 2022)

Valantar said:


> That's very weird - supersampling can make things look a bit sharper, but typically doesn't work well for static images, text or desktop apps. I guess it has a good scaler then!


i guess so,

well, i was surprised by that 299 CHF IPS 32" - heck of a monitor (mostly made out of metal with a bit of plastic, anything but cheap-looking, cast aluminum stand)
the GTX 1070 was barely enough for it at 1440p, but the RX 6700 XT is just perfect (and still holds up quite fine at 1620p)


----------



## BatRastard (Nov 13, 2022)

I take back what I said about HDR on this LG. When you first enable it, it doesn't pop like you normally expect HDR to. The presets that come with the display don't do HDR justice, so I had to do a little bit of tweaking, and I managed to get it looking as close as possible to my Vizio V505-G9 (which needed very little calibration). 

Also had to uninstall LG's Windows-based OSD software -- craptacular junk crashed and froze constantly ...


----------



## Valantar (Nov 13, 2022)

BatRastard said:


> Nope just 1440p ... and HDR is pathetic compared to my Vizio. Other than that, not bad. Haven't gamed on it yet. I'm due for a fresh Windows 11 install, actually.


Honestly, getting HDR support at all on a $200 monitor is good - and getting actually functional HDR that makes a perceptible difference is very unusual. Monitors are overall_ crap _when it comes to HDR support, while TVs are often very very good. It's a damn shame, really. Hardware Unboxed has an excellent monitor test suite where they test for actual HDR (as in more than "can decode an HDR signal": whether it can actually display real high dynamic range imagery, has the contrast, color gamut and brightness to support it, and can control its backlight competently to ensure good image quality) - and the vast majority of "HDR" monitors fail miserably. Extremely few monitors have any kind of local dimming, and most that do have either 8 or 16 edge-lit "zones" - making it essentially useless. And most of those also only qualify for HDR400, which really isn't sufficiently bright for real HDR unless you're in a _very_ dark room and the monitor has _very_ good black levels.

It's good to hear that you could get some perceptible improvement through some tuning, but I'd honestly be shocked if it came anywhere near your Vizio TV. I just bought the cheapest "real" HDR monitor I could get my hands on, an AOC AG274QXM, and that definitely wasn't cheap - but it does HDR1000 and has a 576-zone MiniLED backlight, so its HDR is pretty good at least. It's still not as good as a 2020 Samsung Q80 TV though.


----------



## BatRastard (Nov 13, 2022)

What makes this display ... weird ... is that it's HDR10 but only 350 nits, and it's a VA instead of an IPS. It usually retails for $260 to $340, and I got it for $200 + $12 (Michigan's 6% sales tax). I only paid $300 for my 50" Vizio two or three years ago (which is 600 nits if I recall) and it still retails for that price. But I had only tried to game on it with my GTX 1050 at 1080p, and it was a mess. Games like Gauntlet: Slayer Edition hard-locked the computer (but work fine on the LG), and Shadow Man Remastered refused to use the soundbar and used my 4.1 computer speakers instead. Between the sound, resolution, and refresh issues, the GTX 1050 just wouldn't cut it, and the last game it did 1080p very well on was Yakuza 0 ...


----------



## Valantar (Nov 13, 2022)

BatRastard said:


> What makes this this display ... weird ... is that it's HDR10 but only 350 nits and its a VA instead of an IPS. It usually retails for $260 to $340 and I got it for $200 + $12 (Michigan 6 cent sales tax). I only paid $300 for my 50" Vizio two or three years ago (which is 600 nits if I recall) and it still retails for that price. But I only tried to game on it with my GTX 1050 at 1080p and it was a mess. Games like Gauntlet: Slayer Edition hard locked the computer (but work fine the LG) and Shadowman Remastered refused to use the soundbar and used my 4.1 computer speakers instead. Between the sound, resolution, refresh issues, the GTX 1050 just wouldn't cut it and the last game it did 1080p very well on was Yakuza 0 ...


That sounds rather weird, yeah, but some TVs just don't cooperate with PC GPUs for all kinds of reasons. Thankfully, this seems to be improving as TV gaming with PCs becomes more common (especially through the pandemic). Does it by any chance have an HDMI input labeled DVI or PC? If so, I'd use that and only that for connecting a PC, as that typically means both better communication and less lag-inducing, image-destroying post-processing. But yeah, you also _easily_ get much higher brightness on TVs than on monitors, as TVs are more commonly used in sunny rooms where that brightness is needed. And the larger size makes it easy to stick a much more powerful backlight in there.

As for that monitor, I'd say it's good that it's a VA, as that way, even with no local dimming, the native contrast gets you much closer to baseline HDR requirements - specifically, 14 stops of brightness (1 'stop' = a doubling of light). 1000:1 contrast is ~10 stops (1024:1 would hit that exactly), and 4000:1 is ~12 stops, landing most VA panels at a native dynamic range of around 12 stops - which is _far_ superior to the ~10 stops of a non-locally-dimmed IPS. It's still not actually HDR, but it can get a lot closer. Still, with that 350 nit peak brightness you'd need a pretty dark room to really get the most out of it, as ambient light will inevitably wash out very dark blacks and/or overpower the highlights otherwise.
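Since one stop is one doubling of light, converting a contrast ratio to stops is just a base-2 logarithm; a quick sketch:

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in photographic 'stops' (1 stop = one doubling of light)."""
    return math.log2(contrast_ratio)
```

So a 1000:1 IPS lands at about 10 stops and a 4000:1 VA at about 12, while the roughly 16,000:1 needed for a full 14 stops is only reachable with local dimming or per-pixel emissive tech like OLED.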


----------



## BatRastard (Nov 14, 2022)

My Vizio has 3 HDMI (2 of which support HDR), 1 Composite Input, Analog Out & Optical Out ... 

The LAST thing I wanna see is 50 inches of composite video on that display. Even if I had a Component-to-HDMI converter for my PS2, it's gonna be bowling-shoe ugly. 720p is the lowest resolution worth watching on it.


----------



## Kabouter Plop (Nov 15, 2022)

driver inbound ?






Hopefully some good release notes this time, given the discovery of MPO breaking many things.
Firefox also seems to hate it when other apps run vsync, which causes FreeSync to break, while forcing vsync off makes FreeSync work really well, especially with MPO off.
edit: another disappointing driver that doesn't acknowledge the MPO-related issues. Pretending the issue does not exist does not make it go away, AMD.

AMD is not your friend: even when many report that disabling MPO fixes issues, they won't acknowledge it. Nvidia, meanwhile:





After updating to NVIDIA Game Ready Driver 461.09 or newer, some desktop apps may flicker or stutter when resizing the window on some PC configurations | NVIDIA (nvidia.custhelp.com)
				



I won't feel any pity for AMD when Nvidia destroys their market share even further.


----------



## Kabouter Plop (Nov 17, 2022)

Someone discovered vsync is broken even when vsync is left to application control. Forcing vsync off globally fixes everything, even the MPO flickering within Steam - i can't believe AMD hasn't figured this out yet. It seems vsync is engaging twice by the looks of it, giving twice the latency as a result, and this is leading MPO to crap out and freeze, which then blacks out the screen.


----------



## MrHartmann (Nov 17, 2022)

I bought a used Asus TUF RX 6900 XT OC - very well-built board, I'm just bothered by the temperature: it's getting to 82°C/102°C. My case is relatively well ventilated, and summer hasn't even started here where I live... Anyone else with this model to give feedback on it?


----------



## AusWolf (Nov 17, 2022)

MrHartmann said:


> I bought a used Asus Tuf RX 6900 XT OC, very well built board, I'm just bothered by the temperature, it's getting to 82c/102c, my case is relatively well ventilated and it hasn't even started summer here where I live.. Anyone else with this model to give feedback on it?


That's exactly the temperature I see on my reference 6750 XT. According to AMD, hotspot temp up to 110 °C is fine. Have you run any stability test on it to see if your performance is consistent across different temperature ranges?



Kabouter Plop said:


> Someone discovered vsync is broken even if vsync is left to application control; forcing vsync off globally fixes everything, even MPO flickering within Steam. I can't believe AMD hasn't figured this out yet. It looks like vsync is engaging twice, doubling latency, and this leads MPO to crap out and freeze, which then blacks out the screen.


I've been using Enhanced sync with no problems ever since its debut a couple years ago.


----------



## Kabouter Plop (Nov 17, 2022)

AusWolf said:


> That's exactly the temperature I see on my reference 6750 XT. According to AMD, hotspot temp up to 110 °C is fine. Have you run any stability test on it to see if your performance is consistent across different temperature ranges?
> 
> 
> I've been using Enhanced sync with no problems ever since its debut a couple years ago.



Enhanced sync causes minor freezes when alt-tabbing; it's still broken, although it probably happens more at higher resolutions. Remember, a couple of drivers ago they claimed to have fixed black screens as well, but people have been experiencing black screens since at least 22.6.1 and probably 22.5.2 too. They're probably trying to make it Microsoft's problem so they don't have to fix it.


----------



## MrHartmann (Nov 17, 2022)

AusWolf said:


> That's exactly the temperature I see on my reference 6750 XT. According to AMD, hotspot temp up to 110 °C is fine. Have you run any stability test on it to see if your performance is consistent across different temperature ranges?


I ran a few runs of Time Spy and Fire Strike, about 15 min of Cyberpunk, and 1 h of Metro Exodus; after about 10 min it reached that temperature and stabilized. I was expecting low 70s, given the size of its cooler.

I came from a ROG RX 6800 (non-XT) and an RTX 3090 Waterforce, both very cool cards. 80 °C/100 °C scares me...


----------



## GamerGuy (Nov 17, 2022)

MrHartmann said:


> I bought a used Asus Tuf RX 6900 XT OC, very well built board, I'm just bothered by the temperature, it's getting to 82c/102c, my case is relatively well ventilated and it hasn't even started summer here where I live.. Anyone else with this model to give feedback on it?


I have the Sapphire Nitro+ RX 6900 XT, and when I leave it at the stock fan setting, temps do get pretty high. I dislike the 'Zero RPM' fan setting, so I went into the Adrenalin control panel and created a custom, and aggressive I might add, fan curve. Now, with ambient temps at 23-24 °C, the GPU shows around 70 °C, with hotspot temps not exceeding 90 °C.
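For illustration, a custom curve like that boils down to a few temperature-to-fan-speed points with linear interpolation between them, which is conceptually what Adrenalin's fan curve editor does. The points below are made up for the sketch, not GamerGuy's actual settings:

```python
# Hypothetical aggressive fan curve: (temperature °C, fan duty %).
# Illustrative points only - set your own in Adrenalin's curve editor.
CURVE = [(40, 30), (55, 50), (65, 70), (75, 90), (85, 100)]

def fan_speed(temp_c):
    """Linearly interpolate the fan duty % between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # halfway between the 65 °C and 75 °C points -> 80.0
```

The steeper the segments between points, the sooner the fans ramp, which is what keeps the hotspot from climbing at the cost of noise.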


----------



## MrHartmann (Nov 17, 2022)

GamerGuy said:


> I have the Sapphire Nitro+ RX 6900XT and when I leave it at stock fan setting, temps do get pretty high, I dislike the 'Zero' fan setting, so I'd go into Adrenalin CP and create a custom, and aggressive I might add, fan curve. Now, with Ambient temp at 23-24C, the GPU shows around 70C with Hotspot temps not exceeding 90C.


On mine I needed to lower the maximum speed of the stock curve; it reached 100% above 72 °C or so, which was very noisy. I capped it at 80% above 75 °C...


----------



## Nordic (Nov 18, 2022)

I am considering an upgrade from my 1060 3gb to a 6750xt for $400. Nearly 3x the performance for $400 seems decent.


----------



## AusWolf (Nov 18, 2022)

Nordic said:


> I am considering an upgrade from my 1060 3gb to a 6750xt for $400. Nearly 3x the performance for $400 seems decent.


Which model are you looking at? I have the AMD reference which is beautiful, but runs hot as hell (not that it's an issue, though).


----------



## Terronium-12 (Nov 19, 2022)

*Upgraded from a 2080.






Already had my hand in overclocking it, but I'm still messing about with voltages and the like. I don't know why, but the 319 cards (6900, in particular) just...don't exist to any of the review websites I read or tech channels I frequent.*


----------



## Krajlevic (Nov 19, 2022)

Hi everyone,
I've been enjoying my new Sapphire RX 6650 XT Pulse since Monday. It performs very well at 4K, even with ray tracing in Gotham Knights, for example.
But I've noticed something that didn't happen with my old RX 580 Nitro+, which was plugged in with one 6-pin and one 8-pin connector.
The 8-pin came from my power supply through a Molex-to-8-pin adapter; no problems there, 175 W of consumption at 100% GPU utilization.

Now the 6650 XT gets its juice only through this 8-pin adapter, and weirdly, the sensors in the Adrenalin software don't show the values I've seen in many reviews.

Through GPU-Z and Adrenalin, my consumption is 143 W max, and my GPU utilization doesn't reach 100%, only 99%. All games and benchmarks run smoothly, and my 3DMark results are coherent with what I see in reviews and tests. It's just that strange feeling that maybe my PSU needs an upgrade; it's a 550 W unit (Antec Neo HE) with no native 8-pin cable. I should add that Radeon Chill and power management are off, and the auto overclock in Adrenalin gave me a boost up to 155 W of consumption.

Does anyone else have a 6650 XT and can share their values? I plan to change my PSU, but not immediately.


----------



## GreiverBlade (Nov 19, 2022)

22.11.1: still no alt-tab, MPO, or VSYNC issues (if set to application preference, it's set by the application correctly), and zero black screens (2880x1620 over DP as usual, 1920x1080 secondary screen on HDMI for testing).

Win 10 21H2: still ~3 years until forced Win 11, so I have margin.

They may try to make it an M$ issue, but it seems it is one... (at least from what I see, it mostly happens with Win 11)



Krajlevic said:


> Hi everyone,
> I've been enjoying my new Sapphire RX 6650 XT Pulse since Monday. It performs very well at 4K, even with ray tracing in Gotham Knights, for example.
> But I've noticed something that didn't happen with my old RX 580 Nitro+, which was plugged in with one 6-pin and one 8-pin connector.
> The 8-pin came from my power supply through a Molex-to-8-pin adapter; no problems there, 175 W of consumption at 100% GPU utilization.
> ...


mmhhh, 99% and lower consumption, but everything runs smoothly and 3DMark is coherent? I see no issues here.


----------



## AusWolf (Nov 19, 2022)

Krajlevic said:


> Hi everyone,
> I've been enjoying my new Sapphire RX 6650 XT Pulse since Monday. It performs very well at 4K, even with ray tracing in Gotham Knights, for example.
> But I've noticed something that didn't happen with my old RX 580 Nitro+, which was plugged in with one 6-pin and one 8-pin connector.
> The 8-pin came from my power supply through a Molex-to-8-pin adapter; no problems there, 175 W of consumption at 100% GPU utilization.
> ...


99% usage is normal. The 143 W you see in GPU-Z is GPU-chip-only power consumption. I'm not sure if your 6650 XT can report total board power; my reference 6750 XT doesn't.


----------



## Krajlevic (Nov 19, 2022)

Thank you all.
I was wondering if the Molex adapter wasn't as reliable as a native 8-pin cable. I've also read that the Adrenalin software isn't accurate for power draw and underestimates it.
Thanks for reassuring me!


----------



## AusWolf (Nov 19, 2022)

Krajlevic said:


> Thank you all.
> I was wondering if the Molex adapter wasn't as reliable as a native 8-pin cable. I've also read that the Adrenalin software isn't accurate for power draw and underestimates it.
> Thanks for reassuring me!


Personally, I wouldn't rely on a Molex adapter long-term, but it'll probably do until you change your PSU to a unit with proper 8-pin PCI-e cables - which should be your next upgrade, imo.


----------



## Krajlevic (Nov 19, 2022)

AusWolf said:


> Personally, I wouldn't rely on a Molex adapter long-term, but it'll probably do until you change your PSU to a unit with proper 8-pin PCI-e cables - which should be your next upgrade, imo.


That's what I have in mind ! Thanks again


----------



## BatRastard (Nov 19, 2022)

Well, that's weird... but I won't complain! Just replaced my Ryzen 5 3600 with a 5600. Cinebench would fricassee the 3600 up to a maximum of 71 degrees, and it had been idling in the mid 40s and spiking to the low-to-mid 60s for the last few weeks. But the 5600 topped out at just 55 degrees after 1 hour of Cinebench. These improved thermals put the kibosh on the RX 6600 randomly ramping up and down while surfing the internet or watching YouTube videos...


----------



## AusWolf (Nov 19, 2022)

BatRastard said:


> Well, that's weird ... but I won't complain! Just replaced my Ryzen 5 3600 with a 5600. Cinebench would fricassee the 3600 to a maximum of 71 degrees. It had been idling at the mid 40s and spiking to the low to mid 60s for last few weeks. But the 5600 topped out at just 55 degrees after 1 hour of Cinebench. These improved thermals put the kibosh on the RX 6600 randomly ramping up and down while surfing the internet or watching YouTube videos ...
> 
> View attachment 270742


Congrats on your new CPU, although it has little to do with your GPU ramping up and down. It does that because YouTube and your web browser use its hardware acceleration feature. Although I wonder: do you mean its fan is ramping up and down? How hot is the GPU core while you're watching YouTube? There must be something wrong with your case ventilation.


----------



## Terronium-12 (Nov 20, 2022)

*Ran Superposition several times, ran the FH5 bench several times, and let Heaven loop for a solid 20+ minutes, and this is seemingly stable at 2775/2100 at 1.075v. 




*


----------



## BatRastard (Nov 20, 2022)

AusWolf said:


> Congrats on your new CPU, although it has little to do with your GPU ramping up and down. It does that because youtube and your web browser use its hardware acceleration feature. Although I wonder, do you mean its fan is ramping up and down? How hot is the GPU core while you're watching youtube? There must be something with your case ventilation.


Mid 40s or so. But I think MSI Afterburner was the culprit after all. I had set a manual fan curve with the 1050 and forgot to set it back to Auto when I installed the RX 6600 ...


----------



## Terronium-12 (Nov 21, 2022)

*Question: I've now clocked the 6900 to 2800/2100 and was at 1.075 V, then bumped it up to 1.09 V just to be safe. Then I noticed the voltage was still bumping up to 1.175 V, so does Adrenalin ignore the set voltage if the set frequency requires more to be stable?

I've set it back to the max (1.175 V) for the time being, to be even safer.*


----------



## AusWolf (Nov 21, 2022)

Terronium-12 said:


> *Question: I've now clocked the 6900 to 2800/2100 and was at 1.075v, then bumped it up to 1.09 just to be safe. Then I noticed the voltage was still bumping up to 1.175, so does Adrenaline ignore the set voltage if the set frequency requires more to be stable?
> 
> I've set it back to max (1.175) for the time being to be even safer.*


What I've noticed is that the voltage slider, even though it has fixed values, is more like an offset kind of thing, an approximation.


----------



## 3x0 (Nov 21, 2022)

The only way to cap the voltage is via MorePowerTool


----------



## Nordic (Nov 23, 2022)

Nordic said:


> I am considering an upgrade from my 1060 3gb to a 6750xt for $400. Nearly 3x the performance for $400 seems decent.


I found an even better deal on a 6750 XT for $390 after rebate. I put the order in. This will be my first AMD GPU since my beloved 7970. Is there anything I should know about the Radeon 6000 series?


My main rig has had the following in the last decade.
AMD Radeon HD 6950
AMD Radeon HD 7970
NV GTX 970
NV 1060 3gb (sidegrade)
AMD RX 6750 xt.


----------



## gulkor (Nov 23, 2022)

Hi everyone, just joined the forums and the RX 6000 series club!

Bought on Amazon for $510: the XFX Speedster QICK319 AMD Radeon RX 6800 Core Gaming Graphics Card with 16GB GDDR6, HDMI, 3x DP (AMD RDNA 2, RX-68XLALFD9).

I would like to put the GPU under water cooling; is that possible? Or should I return the GPU, and if so, which card would let me water-cool with the same performance?

GPU-Z says it can't find the video card in its database, while AIDA64 shows it as Video Adapter: XFX RX 6800 Merc319 Black.

Never had this happen to me until now.

Thank you!


----------



## AusWolf (Nov 23, 2022)

Nordic said:


> I found an even better deal on a 6750 XT for $390 after rebate. I put the order in. This will be my first AMD GPU since my beloved 7970. Is there anything I should know about the Radeon 6000 series?
> 
> 
> My main rig has had the following in the last decade.
> ...


Congrats! That's a really awesome deal! 

What brand is it? If it's the reference model, what you need to know is that it's hot. But it's hot by design, so as long as you have good case airflow, you don't need to worry about it. But you do need good case airflow. Also, it's pretty much an overclocked and overvolted 6700 XT, so don't expect to push it a lot further without hitting thermal boundaries.

Other than that, RDNA 2 is a pretty sound series.


----------



## Nordic (Nov 24, 2022)

AusWolf said:


> Congrats! That's a really awesome deal!
> 
> What brand is it? If it's the reference model, what you need to know is that it's hot. But it's hot by design, so as long as you have good case airflow, you don't need to worry about it. But you do need good case airflow. Also, it's pretty much an overclocked and overvolted 6700 XT, so don't expect to push it a lot further without hitting thermal boundaries.
> 
> Other than that, RDNA 2 is a pretty sound series.


Today's GPU names are a mouthful, but it's the MSI RX 6750 XT MECH 2X 12G OC. It looks like it has dropped another $20 since yesterday.


----------



## BatRastard (Nov 24, 2022)

The MSI RX 6600 Mech 2 I bought about two or three weeks ago for $239?!? It just dropped to $189 ...


----------



## AusWolf (Nov 24, 2022)

Nordic said:


> Today's GPU's are a mouthful, but it is the MSI RX 6750 XT MECH 2X 12G OC. It looks like it has dropped another $20 since yesterday.


MSI's Mech is kind of their "bare minimum" series, just like the reference design. It'll work, but don't expect to overclock it to the Moon.



BatRastard said:


> The MSI RX 6600 Mech 2 I bought about two or three weeks ago for $239?!? It just dropped to $189 ...


I know that feeling. AMD started bundling The Callisto Protocol just a week after I bought my 6750 XT.


----------



## Kabouter Plop (Nov 24, 2022)

I think something is failing in my system. I'm getting system freezes and GPU driver crashes while playing World of Warcraft. A couple of months ago I had problems with memory, but I fixed it, and I'm unable to find errors so far, which is frustrating. I'm getting a new memory kit later today, although it's a bit slower. I'm worried my PSU may be faulty:
I'm seeing it switch from 11.980 V to 11.884 V every couple of seconds, sometimes dipping to 11.788 V in rare cases, and peaking at 12.076 V.
Perhaps my memory is fine, but the RAM slot is having issues.

My fourth stick basically had 3 pins that were a bit smudged and dirty, not as glossy. At the time, a couple of months ago, that stick refused to POST, but after I cleaned it up, the board took it again, no problem. I've tested multiple times overnight without errors, yet I get system freezes while playing WoW, or a TDR, though mostly a system freeze, which is very frustrating. Running FurMark for 30 minutes is no problem at all, so it can't be my GPU, right?

The OCCT 3D test also passes without errors, as does the VRAM test. This is confusing; surely system freezes are caused by a hardware problem, right? Even the memory won't throw errors.


----------



## AusWolf (Nov 24, 2022)

Kabouter Plop said:


> I think something is failing in my system. I'm getting system freezes and GPU driver crashes while playing World of Warcraft. A couple of months ago I had problems with memory, but I fixed it, and I'm unable to find errors so far, which is frustrating. I'm getting a new memory kit later today, although it's a bit slower. I'm worried my PSU may be faulty:
> I'm seeing it switch from 11.980 V to 11.884 V every couple of seconds, sometimes dipping to 11.788 V in rare cases, and peaking at 12.076 V.
> Perhaps my memory is fine, but the RAM slot is having issues.
> 
> My fourth stick basically had 3 pins that were a bit smudged and dirty, not as glossy. At the time, a couple of months ago, that stick refused to POST, but after I cleaned it up, the board took it again, no problem. I've tested multiple times overnight without errors, yet I get system freezes while playing WoW, or a TDR, though mostly a system freeze, which is very frustrating. Running FurMark for 30 minutes is no problem at all, so it can't be my GPU, right?


11.78 V is a bit low, but still within the ATX spec. I would swap that PSU anyway, just to be safe.
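Since the thread keeps comparing rail readings to the spec, here's a minimal sketch of that arithmetic, assuming the usual ±5% ATX tolerance on the +12 V rail (an acceptable window of 11.40 V to 12.60 V):

```python
# Sanity-check measured +12 V rail readings against the ATX +/-5% tolerance.
NOMINAL = 12.0
TOLERANCE = 0.05  # assumed +/-5% per the ATX guideline
LOW, HIGH = NOMINAL * (1 - TOLERANCE), NOMINAL * (1 + TOLERANCE)

def within_atx_spec(volts):
    """True if the reading falls inside the tolerance window."""
    return LOW <= volts <= HIGH

# The readings reported earlier in the thread:
for reading in (11.980, 11.884, 11.788, 12.076):
    print(f"{reading:.3f} V: {'OK' if within_atx_spec(reading) else 'out of spec'}")
```

All four readings pass, but 11.788 V sits only ~0.39 V above the 11.40 V floor, which is why it looks uncomfortable even though it's technically in spec.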


----------



## Kabouter Plop (Nov 24, 2022)

AusWolf said:


> 11.78 V is a bit low, but still within the ATX spec. I would swap that PSU anyway, just to be safe.



I thought so too. Apparently 11.4 V is also within spec, and some reviews mention 11.788 V as well; it's rarely dropping that low, though. I've disabled C-states now to see if that has any impact, as well as enabling SOC OC mode. I'm starting to think all my problems with freezes are chipset driver related, since they've been fixing some issues there as well; or perhaps my board is failing.
I think the odds of my PSU failing are quite low, but I still have my EVGA 1300 G2 lying around. I could plug in the bare minimum and test with that PSU, but I'm not looking forward to it.

Got my new memory kit now: Hynix instead of Samsung B-die, and the timings are awful, lol: 18-22-22-42-64. I just want to confirm it gets rid of the system freezes. If it does, I'm RMA-ing my old kit; even though I get no errors, it's obviously to blame if this fixes it. It has failed to POST on one stick before, which I fixed.


----------



## BatRastard (Nov 24, 2022)

I ran a PSU whose +12 V rail once dropped to 11.44 V, and it was all downhill from there. I knew age, wear, and tear would degrade it further. I replaced it with a Corsair TX650 in 2008, and it's still going: the fan died, but that PSU feeds my Linux box today. I'd replace yours ASAP. Sure, 11.44 V is still in spec, but it's probably delivering dirty power (depending on the wiring).


----------



## Kabouter Plop (Nov 24, 2022)

Just froze with the new memory kit, so either my GPU is dying, or the motherboard, or the PSU, which is only a year old with a 10-year warranty. I'm running with just one stick now, because the memory bank the failed stick was in may be faulty as well, so I'm running only 16 GB in A2 to see how that affects the system freezes. The last thing I can try is replacing the PCIe cables for my mainboard and GPU and reseating the mainboard cable.
It seems to happen more often on AGESA 1203c, which lets me set EDC at 190; when I'm capped to 140, it seems to happen less often.

In case I haven't mentioned this yet: I only get freezes in WoW, never in other games, although that means nothing yet, since I've only played WoW lately. I'm testing with Heaven running in the background right now to see if I can reproduce the freezes there, and then I'll test other 3D loads as well, preferably ones I can leave running in the background that don't overload the GPU as crazily as FurMark would, but give enough load to make the wattage go up and down like normal game loads.

I've now also put the stick I suspect to be faulty into the suspect slot while testing, in hopes I can trigger the freeze faster.


----------



## gulkor (Nov 24, 2022)

Kabouter Plop said:


> Just froze with the new memory kit, so either my GPU is dying, or the motherboard, or the PSU, which is only a year old with a 10-year warranty. I'm running with just one stick now, because the memory bank the failed stick was in may be faulty as well, so I'm running only 16 GB in A2 to see how that affects the system freezes. The last thing I can try is replacing the PCIe cables for my mainboard and GPU and reseating the mainboard cable.
> It seems to happen more often on AGESA 1203c, which lets me set EDC at 190; when I'm capped to 140, it seems to happen less often.
> 
> In case I haven't mentioned this yet: I only get freezes in WoW, never in other games, although that means nothing yet, since I've only played WoW lately. I'm testing with Heaven running in the background right now to see if I can reproduce the freezes there, and then I'll test other 3D loads as well, preferably ones I can leave running in the background that don't overload the GPU as crazily as FurMark would, but give enough load to make the wattage go up and down like normal game loads.
> ...


Run MemTest86 to test your memory.

MemTest86 - Official Site of the x86 Memory Testing Tool (www.memtest86.com)
MemTest86 is the original self-booting memory testing software for x86 and ARM computers. Supporting both BIOS and UEFI, with options to boot from USB.


----------



## Kabouter Plop (Nov 24, 2022)

gulkor said:


> Run MemTest86 to test your memory.
> 
> 
> 
> ...



Not even going to bother, since I had system freezes with the brand-new kit. I'm testing different loads now to confirm whether it happens exclusively in WoW.
Normally I would, though. If I start getting freezes or TDRs in other 3D loads, I'm going to test my spare PSU, the 1300 W one.


----------



## gulkor (Nov 24, 2022)

Kabouter Plop said:


> Not even going to bother, since I had system freezes with the brand-new kit. I'm testing different loads now to confirm whether it happens exclusively in WoW.
> Normally I would, though. If I start getting freezes or TDRs in other 3D loads, I'm going to test my spare PSU, the 1300 W one.


I found this on the Blizzard forum: https://us.forums.blizzard.com/en/wow/t/extreme-freezing-and-performance-issues-in-sepulcher/1197851

I would try uninstalling your GFX drivers: boot into safe mode, run DDU, then reinstall fresh GFX drivers.

Some members on the WoW forum said it helped with the freezing and locking up.

A WoW forum member said:

Update from my end:

Cleaning my case (to improve GPU temps) and moving the WoW install to the C drive didn’t resolve the issue. My GPU temps did improve by a degree or two, but it’s still in the 75-80 range, which should be perfectly acceptable for an Nvidia card.

So, I tried to remove the graphics driver again, but this time I used the DDU (Display Driver Uninstaller) tool to fully remove the driver and reinstall the latest from Nvidia. Much to my surprise…so far this has worked. I’ve been able to play for longer periods without a freeze thus far. I did dailies last night and this morning along with a couple bosses in LFR as test cases. Tonight I have raid with my guild, so we’ll see if it holds. If it does, then the issue is either resolved or at least back to a manageable level.

To comment on Karsh’s experience. Removing my addons, uninstalling and reinstalling, and moving the install were all unsuccessful for me, and in fact made my issue much worse to the point where I could no longer log in to a character without forcing DirectX 11. Also, after looking at your dxdiag Karsh, we both have a GeForce GTX 1070 card, which could be a common element to the issue, possibly.


----------



## Nordic (Nov 24, 2022)

AusWolf said:


> MSi's Mech is kind of their "bare minimum" series, just like the reference design. It'll work, but don't expect to overclock it to the Moon.


I bought it for the performance per dollar value. It's not like a 6750 xt was ever going to set any overclocking records.


----------



## AusWolf (Nov 24, 2022)

Nordic said:


> I bought it for the performance per dollar value. It's not like a 6750 xt was ever going to set any overclocking records.


Agreed. Just like my AMD reference card. At 100 quid cheaper than AIB ones, I'm really not complaining.


----------



## Kabouter Plop (Nov 24, 2022)

I'd rather not buy from a vendor that does shady stuff like voiding the warranty if you remove the stickers on their coolers, even though that's illegal in some parts of the world. It just proves they'll seek a way out of warranty at any cost, even for something as simple as removing a sticker.


----------



## HD64G (Nov 26, 2022)

Glad to have joined this club, as I got the Sapphire Pulse RX 6750 XT for €500 a few days ago and installed it just yesterday. I'm testing a few undervolting profiles, as my 144 Hz 1080p monitor (which now works at 144 Hz with the VRAM downclocking properly, as opposed to my RX 5700 not doing so) doesn't need that much graphics power for most games. The auto-undervolting setting gives 2.7 GHz, stable at 1.175 V instead of the default 1.2 V. I've already tested the 2 GHz @ 1 V profile, which is almost surely stable (80% of the performance in 3DMark Time Spy for 90 W less power draw: 13600 down to 11000 points), and I'll test a few more. Any suggestions for a good profile from an RX 6750 XT or RX 6700 XT owner?


----------



## Krajlevic (Nov 26, 2022)

AusWolf said:


> Personally, I wouldn't rely on a Molex adapter long-term, but it'll probably do until you change your PSU to a unit with proper 8-pin PCI-e cables - which should be your next upgrade, imo.


After checking more deeply, the card's BIOS says it's a 143 W card, as you can see below, so nothing is strange...
And I've changed my cable for a proper 8-pin PCIe one.


----------



## AusWolf (Nov 26, 2022)

HD64G said:


> Glad to have joined this club, as I got the Sapphire Pulse RX 6750 XT for €500 a few days ago and installed it just yesterday. I'm testing a few undervolting profiles, as my 144 Hz 1080p monitor (which now works at 144 Hz with the VRAM downclocking properly, as opposed to my RX 5700 not doing so) doesn't need that much graphics power for most games. The auto-undervolting setting shows that 2.62 GHz works stable at 1.175 V instead of the default 1.2 V. I've already tested the 2 GHz @ 1 V profile, which is almost surely stable (80% of the performance in 3DMark Time Spy for 90 W less power draw: 13600 down to 11000 points), and I'll test a few more. Any suggestions for a good profile from an RX 6750 XT or RX 6700 XT owner?


Congrats and welcome! 

Personally, I don't play with profiles - I just set a 60 FPS limit and call it a day. This may or may not be a useful tip depending on what display you have.


----------



## HD64G (Nov 26, 2022)

I just tested some tunes and observed that while I set the GPU voltage to *1 V in the AMD suite, GPU-Z reported 0.825 V*. And for another tune, *the AMD suite was set to give 1.175 V, and GPU-Z reported 1.043 V*. Any info on that deviation @W1zzard ?


----------



## outpt (Nov 26, 2022)

Does anyone know how to turn down the temperature in the MSI or AMD suites for a 6800 non-XT?


----------



## HD64G (Nov 26, 2022)

outpt said:


> Does anyone know how to turn down the temperature in msi or amd suites
> For a 6800 non X


In the AMD driver you can just set the fans to speed up sooner and keep temps lower. There is a curve with points you can move.


----------



## AusWolf (Nov 26, 2022)

HD64G said:


> I just tested some tunes and observed that while I set the GPU voltage at *1V in the AMD suite, the GPUz reported 0.825V*. And for another tune the *AMD suite was adjusted to give 1.175V and the GPUz reported 1.043V*. Any info on that deviation @W1zzard ?


As far as I've seen, even though the voltage bar has set values, the actual voltage you get in the end isn't a set value. It's more of a curve offset kind of thing - same with the frequency bar.


----------



## HD64G (Nov 26, 2022)

Tested another tune with 2.3 GHz @ 1.125 V in the driver suite, which resulted in 2.28 GHz @ 0.87 V. Great efficiency, with a max power draw of 110 W. I can only suppose the firmware has algorithms and the die auto-adjusts the voltage needed to keep those clocks, no matter what max limit is set. I have no problem with that; on the contrary, I find it innovative and very efficient, without losing any considerable performance (~90% of stock performance for just over half the power).


----------



## AusWolf (Nov 26, 2022)

HD64G said:


> Tested another tune with 2.3 GHz @ 1.125 V in the driver suite, which resulted in 2.28 GHz @ 0.87 V. Great efficiency, with a max power draw of 110 W. I can only suppose the firmware has algorithms and the die auto-adjusts the voltage needed to keep those clocks, no matter what max limit is set. I have no problem with that; on the contrary, I find it innovative and very efficient, without losing any considerable performance (~90% of stock performance for just over half the power).


If you consider the fact that the 6750 XT is nothing more than an overclocked and overvolted version of the 6700 XT, there must be a crapton of headroom for making it more efficient.


----------



## mechtech (Nov 27, 2022)

Just got an Asus 6600 for $270 CAD w/o tax; I'll probably put it away for a long while for my son's birthday. Never know what the market is going to be like 7 months from now. But I'm guessing all the 6600s will be sold out, the 7600/XT won't be out yet, and the cheapest card will be double what I just paid.


----------



## W1zzard (Nov 27, 2022)

HD64G said:


> I just tested some tunes and observed that while I set the GPU voltage at *1V in the AMD suite, the GPUz reported 0.825V*. And for another tune the *AMD suite was adjusted to give 1.175V and the GPUz reported 1.043V*. Any info on that deviation @W1zzard ?


From what I understand, this voltage is the maximum; the GPU will choose the actual clock and voltage by itself, seemingly at random from time to time.


----------



## bobalazs (Nov 27, 2022)

Where would the sweet spot of the 6700 XT be, voltage/MHz/watt/FPS-wise?
I saw that the last 50 W were squeezed out of the card to keep up with the competition, losing a lot of efficiency in the process.


----------



## HD64G (Nov 27, 2022)

W1zzard said:


> From what I understand this voltage is the maximum, the GPU will choose the actual clock and voltage by itself, seemingly random from time to time





bobalazs said:


> Where would be the sweetspot of 6700 XT voltage / Mhz / watt / fps wise ? I saw the last 50 watts were squeezed out of the card to keep up with the competition, therefore losing a lot on the efficiency.


In my RX 6750 XT's case, you can get ~97% of the max performance by tuning the max clocks to 2.6 GHz and the voltage to 1.175 V, which results in a 1.05 V max and limits the power draw from 210 W to ~170 W (the power figure that AMD reveals through the sensors).

This 1.05 V is an absolute max limit that only goes lower depending on the load. In another tune I tested, which retains close to 90% of the max performance, 1.125 V for 2.3 GHz in the driver suite resulted in a 0.97 V max limit that lowered the power draw to 110 W max.

Maybe the 6 nm process helped with efficiency, but I suspect most RX 6700 XTs will be close to that.
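As a rough sketch of the perf-per-watt arithmetic behind these tunes, using only the approximate figures quoted above:

```python
# Perf-per-watt of the tunes above, relative to stock, from the rough
# numbers in the post: (relative performance, reported max power in W).
tunes = {
    "stock":           (1.00, 210),  # 100% performance at ~210 W
    "2.6 GHz/1.175 V": (0.97, 170),  # ~97% performance at ~170 W
    "2.3 GHz/1.125 V": (0.90, 110),  # ~90% performance at ~110 W
}

stock_perf, stock_watts = tunes["stock"]
for name, (perf, watts) in tunes.items():
    rel_eff = (perf / watts) / (stock_perf / stock_watts)
    print(f"{name}: {rel_eff:.2f}x stock perf/W")
# stock: 1.00x, 2.6 GHz tune: ~1.20x, 2.3 GHz tune: ~1.72x
```

In other words, the mild tune buys roughly a 20% efficiency gain for a 3% performance cost, and the aggressive one roughly 70% for a 10% cost, which matches the usual shape of a voltage/frequency curve: the last few hundred MHz are the most expensive.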


----------



## AusWolf (Nov 27, 2022)

bobalazs said:


> Where would be the sweetspot of 6700 XT voltage / Mhz / watt / fps wise ?
> I saw the last 50 watts were squeezed out of the card to keep up with the competition, therefore losing a lot on the efficiency.


I think that depends on your expectations. Personally, I just set a 60 FPS limit, and call it a day, as the card will draw as much power as it needs to achieve that.


----------



## bobalazs (Nov 27, 2022)

HD64G said:


> In my RX 6750 XT's case, you can get ~97% of the max performance by tuning the max clocks to 2.6 GHz and the voltage to 1.175 V, which results in a 1.05 V max and limits the power draw from 210 W to ~170 W (the power figure that AMD reveals through the sensors).
> 
> This 1.05 V is an absolute max limit that only goes lower depending on the load. In another tune I tested, which retains close to 90% of the max performance, 1.125 V for 2.3 GHz in the driver suite resulted in a 0.97 V max limit that lowered the power draw to 110 W max.
> 
> Maybe the 6 nm process helped with efficiency, but I suspect most RX 6700 XTs will be close to that.





AusWolf said:


> I think that depends on your expectations. Personally, I just set a 60 FPS limit, and call it a day, as the card will draw as much power as it needs to achieve that.



My hardware setup is : https://valid.x86.fr/vawy9v
6500+6700 xt reference
1080p 165 hz mon + 4k 60 Hz tv

I used Radeon software and Rivatuner Statistics Server to monitor and  a power meter connected to the wall.

I have set a Voltage Vmin/Vmax (mV) value of
793 - 1031
and a Power Limit (W) of
147 GPU
and temp target 65 C
in MorePowerTool.
what results is

Cyberpunk 2077, 1080p Ultra:

1031 mV
120 W
91-98 FPS


Forza Horizon 5, Extreme preset:

1031 mV
114 W
126 FPS


The Division 2, 1080p Ultra:

1010-1030 mV
2297 MHz
135 W
93 FPS

The Division 2 pulled the most power: 252 W from the wall socket.

Temps are really low on the reference card, about 62-63°C.
Pretty content with that.

I could probably go up to 186 W with the stock reference setting in MorePowerTool, but there is no point: an extra 10% performance for 80 W more power from the wall, and temps would be higher by at least 10°C, as the reference cooler isn't all that great.

To me it looks like the best efficiency is under 100 watts.
I also came upon a twitter post showing watt per performance on a chart:
twitterlink
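For anyone curious how I compare these runs, here's roughly the arithmetic (a sketch of mine, not a tool from the thread; the ~90% PSU efficiency and the "stock" run at +10% FPS / 186 W are assumptions, not measurements):

```python
# Rough arithmetic behind the comparison above. Assumptions (not measurements):
# ~90% PSU efficiency at load, hypothetical stock run at +10% FPS and 186 W.

def gpu_delta_from_wall(wall_load_w: float, wall_idle_w: float,
                        psu_efficiency: float = 0.90) -> float:
    """Approximate extra DC power drawn under load, from two wall readings."""
    return (wall_load_w - wall_idle_w) * psu_efficiency

def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

# The Division 2 numbers from the post: 93 FPS at 135 W reported by sensors.
tuned = perf_per_watt(93, 135)
# Hypothetical stock run: ~10% more FPS at the 186 W stock power limit.
stock = perf_per_watt(93 * 1.10, 186)
print(f"tuned: {tuned:.2f} FPS/W, stock: {stock:.2f} FPS/W")
```

The sensor figure excludes some board-level losses, which is part of why the wall number is so much higher.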


----------



## Lagin (Nov 27, 2022)

Ryzen 7 5800X, RX 6700 XT, 32 GB RAM, two 980 Pro 1 TB SSDs, 570 Tomahawk, with a Samsung Odyssey G7 27". I built it for 1440p gaming. Haven't touched anything; running everything stock. Anything I should do or not do?


----------



## HD64G (Nov 27, 2022)

Lagin said:


> Ryzen 7 5800X, RX6700XT, 32 gb ram, 980 pro 1tb ssd X2, 570 tomahawk with  Samsung Odyssey G7 27" I built it for 2K gaming, Haven't touched anything, running stock everything. Anything I should do or not do?


You can lower power draw, temps and noise by limiting voltage and clocks in the Radeon software suite. Test 2300MHz@1,125V. If stable, that would lower power draw by ~70W and temps by ~15°C, while keeping about 90% of the stock performance.
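The rough maths behind why a small clock/voltage cut saves so much (a sketch of mine; the 2.6 GHz @ 1.2 V stock operating point is an assumption, and the driver voltage is only a target, so real savings will vary by card):

```python
# Dynamic power scales roughly with frequency x voltage^2, so a modest
# undervolt/underclock compounds. Stock point 2.6 GHz @ 1.2 V is assumed.

def relative_dynamic_power(f_mhz: float, v: float,
                           f_stock: float = 2600.0,
                           v_stock: float = 1.2) -> float:
    """Dynamic power of (f_mhz, v) relative to the assumed stock point."""
    return (f_mhz / f_stock) * (v / v_stock) ** 2

ratio = relative_dynamic_power(2300, 1.125)
print(f"~{ratio:.0%} of stock dynamic power")
```

That lands around three-quarters of stock dynamic power, which on a ~200 W-class card is a saving in the same ballpark as the figure above.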


----------



## Lagin (Nov 27, 2022)

Is 15184 multithread cinebench score good? Gaming only.


----------



## AusWolf (Nov 27, 2022)

Lagin said:


> Is 15184 multithread cinebench score good? Gaming only.


Cinebench has nothing to do with gaming, or Radeon 6000-series. You're in the wrong thread.


----------



## Lagin (Nov 27, 2022)

AusWolf said:


> Cinebench has nothing to do with gaming, or Radeon 6000-series. You're in the wrong thread.


ty, I was reading as I posted. vs reading and asking questions, lol


----------



## Nordic (Nov 30, 2022)

Got my 6750 xt in today. I didn't realize it had such a tall cooler. The fans are 100mm. It almost looks like I could modify the shroud and install 120mm fans. That is probably not necessary.


----------



## HD64G (Nov 30, 2022)

After more testing, the 1.125V setting (which translates into 0.87V real-time at 2.3GHz) proved unstable in Total War: Troy. So I increased the voltage in the driver to 1.15V, which ended up stable and results in 0.9V when pushed.


----------



## Nordic (Dec 1, 2022)

I am trying to enable Resizable BAR. I can't enable Resizable BAR unless I have CSM disabled. I can't boot into Windows from my NVME drive unless I have CSM enabled. Is there a way to resolve this?


----------



## HD64G (Dec 1, 2022)

Nordic said:


> I am trying to enable Resizable BAR. I can't enable Resizable BAR unless I have CSM disabled. I can't boot into Windows from my NVME drive unless I have CSM enabled. Is there a way to resolve this?


Methinks you need to convert your OS disk to GPT first. Look here: https://www.diskpart.com/gpt-mbr/gpt-format-windows-10-0725.html
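If you'd rather not repartition by hand, Windows 10 (1703 and later) ships `mbr2gpt.exe`, which can convert the system disk in place without data loss — though backing up first is still wise. A minimal sequence, run from an elevated Command Prompt:

```shell
:: Check the system disk can be converted (/validate changes nothing).
mbr2gpt /validate /allowFullOS

:: If validation passes, convert MBR -> GPT in place.
mbr2gpt /convert /allowFullOS

:: Then reboot into firmware setup, disable CSM (UEFI boot only),
:: and re-enable Resizable BAR / Above 4G Decoding.
```

`/allowFullOS` is needed because the tool is designed to run from WinPE by default.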


----------



## Nordic (Dec 1, 2022)

Is there any way to expand the power limit both up and down? It goes from -6% to 0%.



HD64G said:


> Me thinks you need to turn your OS partition into a GPT one first. Look there: https://www.diskpart.com/gpt-mbr/gpt-format-windows-10-0725.html


Thank you. That worked!


----------



## Kabouter Plop (Dec 3, 2022)

GPT is required for UEFI; MBR is legacy boot, which doesn't support Resizable BAR.


----------



## GamerGuy (Dec 14, 2022)

Though I'm still an RX 6900 XT owner, it will go into my 2nd rig. The main rig will get an XFX MERC 310 RX 7900 XTX which I'd just bought. You guys think I should start an RX 7000 series owners' club?


----------



## AusWolf (Dec 14, 2022)

GamerGuy said:


> Though I'm still an RX 6900 XT owner, it will go into my 2nd rig. Main rig will get an XFX MERC 310 RX 7900 XTX which I'djust bought. You guys think I should start an RX 7000 series owners' club?


Sure, why not.  That will be one heck of a second rig!


----------



## Lagin (Dec 16, 2022)

Will FSR 2.0 work on my 6700 XT? No word from them on Adrenalin updates (I asked).


----------



## GamerGuy (Dec 16, 2022)

Lagin said:


> will FSR 2.0 work on my 6700xt? no word from them on Adrenaline updates (I asked)


Sure, why not? FSR 1.0 and 2.0 are meant for RX 5000 and 6000 cards (as well as other AMD cards, and even Nvidia cards); it's the upcoming FSR 3.0 that is up in the air, though AMD is said to be working on making it run on most modern cards, including Nvidia's. AMD has said that if they were to focus only on latest-gen cards like the RX 7000 series (the way Nvidia limits DLSS 3 to the RTX 4000 series), it'd be easier and faster to get FSR 3 out for just that one generation of cards.

For you to use FSR, the game has to support FSR 1.x or 2.x; if it does, there should be an option in the game's graphics settings to enable it.


----------



## b0uncyfr0 (Dec 16, 2022)

Which is the best card out of these three?

Gigabyte Eagle 6700XT
Sapphire Nitro 6700XT
MSI 6700XT MECH 2X OC

I have an FT02, and investigating which cooler/heatpipe layout will work best in a vertical case is a nightmare, but airflow will be top notch.

I intend to overclock as much as possible (plus apply any mods if available)


----------



## 3x0 (Dec 16, 2022)

Nitro>>>Eagle>>>>>Mech


----------



## b0uncyfr0 (Dec 16, 2022)

3x0 said:


> Nitro>>>Eagle>>>>>Mech


Oh thanks. Why is the Mech last though? It seems the most compatible with my case


----------



## Nordic (Dec 16, 2022)

I have a 6750 XT Mech. It isn't loud or hot. It is MSI's budget line, though.


----------



## kapone32 (Dec 16, 2022)

b0uncyfr0 said:


> Oh thanks. Why is the Mech last though? It seems the most compatible with my case


It has the weakest cooler, and there may be features like fan headers and RGB missing.


----------



## 3x0 (Dec 16, 2022)

Usually the least amount of effort went into the PCB/VRM and cooling quality (fan noise and temps).


----------



## AusWolf (Dec 17, 2022)

Budget and reference designs aren't totally unusable, I might add - one just has to put up with more fan noise and/or heat, and less overclocking headroom compared to bigger, more expensive models.


----------



## Kabouter Plop (Dec 18, 2022)

Anyone playing WoW in here?
Cos I know there are quite a lot of user reports of GPU driver crashes and system freezes right now.
Been testing the GPU in Superposition for 8 hours with no issues, and 12+ hours in Heaven now.
Getting amazing temps at +15% power limit.




It's cold here tho. I don't wanna use MPT yet unless I'm sure it's 100% a GPU driver issue and my GPU is healthy.
I think WoW is having AMD GPU issues right now; older drivers get fewer driver crashes. So far it only happens in WoW, but I'm very paranoid cos of past driver issues. The new driver is way more stable tho, and one of the issues that got fixed was the MPO flicker; I didn't even realize this was the flicker they meant that some users were having.
But it happens on 22.5.1, for example, if hardware-accelerated webviews are on.


http://imgur.com/WDiYSe4


On this specific page, resizing the window from big to small erratically for 2 minutes causes a GPU driver timeout or black screen:








Steam Points Shop - store.steampowered.com
				




In a way, WhatsApp video calls have a similar issue, but instead there's a huge slowdown and stutter, and vsync being on causes drag/lag issues. At least the old app did; the new app seems to have moved it inside the video-call window instead of the window itself. It's surprising no one else ever reproduced the WhatsApp issue, but I discovered some apps play a role in triggering black screens, like RTSS: if it isn't running, it takes considerably longer to trigger. And for it to trigger, you must have stutter and slowdowns, which may only be happening on Windows 11, not Windows 10.
Anyway, hoping the next driver hits soon with more known issues fixed. 22.11.2 is more stable than 22.5.1 technically speaking, but not there yet. Hopefully AMD has the right idea now about what's wrong. I feel like I've been too stubborn to test Windows 10 all this time and have been suffering as a result; the video linked from imgur may only be happening on Windows 11 as well.

Edit: ended up doing a 24-hour Heaven benchmark run with no problem, even getting no errors in a 1-hour VRAM test right after; now running Port Royal in a loop, 63 loops, almost a 2-hour run.


----------



## Garlic (Dec 24, 2022)

Hi guys, anyone know the thermal pad thickness on the 6900xt Toxic?


----------



## Space Lynx (Dec 24, 2022)

Kabouter Plop said:


> Anyone playing wow in here ?
> Cos i know there quite a lot of user reports of gpu driver crashes and system freezes right now.
> Been testing gpu in superposition for 8 hours no issues and 12+ hours in heaven now.
> Getting amazing temps at +15% power limit
> ...



I have a 6800 XT, and Blizzard just let me know I have a 3-day trial to try out the new expansion. I never bought it, but I was planning to give the trial a chance soon. I may do that. I will report back whether I have any issues or not.


----------



## bobalazs (Dec 25, 2022)

AusWolf said:


> Budget and reference designs aren't totally unusable, I might add - one just has to put up with more fan noise and/or heat, and less overclocking headroom compared to bigger, more expensive models.


I have a reference 6700 XT and undervolted it with a Voltage Vmax of 1050 mV in MorePowerTool (Power tab), then use the overclock in Radeon Software in-game (per-game profile) with GPU clocks set to 2600 MHz min / 2700 MHz max. The result is cool, less power (always below 155 watts), quiet fans, and generally very satisfying and stable.


----------



## Kabouter Plop (Dec 27, 2022)

Space Lynx said:


> I have a 6800 XT and Blizzard just recently let me know I have a 3 day trial to try to out the new expansion, never bought it, but I was planning to give the 3 day trial a chance soon. I may do that. I will report back if I have any issues or not.



Expect GPU driver freezes and crashes - there are many reports of this, at least on the latest GPU drivers - but don't let that stop you from trying; Dragonflight is surprisingly good so far.


----------



## b0uncyfr0 (Dec 30, 2022)

Thanks a lot for the info, fellas: I ended up with a used 6600 XT for cheap. It's also performing quite well in my FT02. Seems I got lucky, because my best graphics score puts me at number 5 for all 6600 XTs: https://www.3dmark.com/fs/29159398

I'm currently at 2900/2300 stable. I've also applied Igor's adjustments:

Power limit to 160 - is it worth going higher? It's definitely pulling more than 160 according to Afterburner.
TDC limits - 127 and 25. I probably shouldn't play with these too much?
MVDD at 1400 - go to 1450, or is that a bad idea?
And of course the FCLK at 1980 - what's the consensus on going above 2000?

It averages around 65 degrees and the hotspot doesn't go above 95, so I think I have some more room to work with.


----------



## 3x0 (Dec 30, 2022)

@b0uncyfr0 Here are my 3DMark results (from when I had 5600X). 2833/2933 Core, 2250 VRAM, 155W in MPT, 186W max +20% offset, FCLK 2000, 130A GFX (default on my card) and 25A SoC.
For me, MVDD 1400 didn't improve stability for higher VRAM clocks. Going above 190W PPT isn't beneficial, and could even be detrimental in some temp-limited situations.


----------



## AusWolf (Dec 30, 2022)

b0uncyfr0 said:


> Thanks alot for the info fellas : i ended up with a used 6600xt for cheap. Its also performing quite well in my FT02. Seem's i got lucky because my best gfx score puts me at number 5 for all 6600xt's : https://www.3dmark.com/fs/29159398
> 
> Im currently at 2900/2300 stable. Ive also applied Igor's adjustment's :
> 
> ...


Congrats, I wish you a long and enjoyable use of the card! 

First of all, forget about Afterburner. Adrenalin's built-in tuning tool more than serves your needs. I don't overclock (I have a reference 6750 XT which pretty much runs up to its limits at stock), but if I did, I would first apply an auto undervolt, then see how far the core clock goes. The values you asked about are advanced, deep-level stuff that's best left for the card/driver to handle.


----------



## Space Lynx (Dec 30, 2022)

Kabouter Plop said:


> Expect gpu driver freezes and crashes there many reports of this atleast on latest gpu drivers, but dont let that stop you from trying Dragonflight is surprisingly good so far.



just letting you know I am on day 2 of my WoW Dragonflight free trial, with my 6800 xt, I have had 0 crashes and everything has been rock solid, completely maxed out with vsync on (raytracing is off) 165 fps 165hz 1440p.

not a single crash in about 10 hours or so of gameplay, I am really enjoying leveling my Drakthyr or w.e it is called. I don't think I will main it, but its been a fun little trial. I am still not sure yet if I will pay the 50 for the expansion or not yet, if the story hooks me we will see, I still have 3 levels to go or a day and half of time left... before I cap out.

so if it impresses me by then I will buy it, I do think it looks great though. WoW graphics have come a long way. I like the unique look of it still, especially being as polished as it is.

I am on a clean install of Win 11 and latest drivers.


----------



## Kabouter Plop (Dec 30, 2022)

Space Lynx said:


> just letting you know I am on day 2 of my WoW Dragonflight free trial, with my 6800 xt, I have had 0 crashes and everything has been rock solid, completely maxed out with vsync on (raytracing is off) 165 fps 165hz 1440p.
> 
> not a single crash in about 10 hours or so of gameplay, I am really enjoying leveling my Drakthyr or w.e it is called. I don't think I will main it, but its been a fun little trial. I am still not sure yet if I will pay the 50 for the expansion or not yet, if the story hooks me we will see, I still have 3 levels to go or a day and half of time left... before I cap out.
> 
> ...



I had no crashes for the first 3 days myself either; it's pretty random. You may even get no crashes for a week and then suddenly start crashing daily. Hopefully you won't run into that... Chances are AMD will list it as a known issue, since it's a common one; the reports I've seen are often on full-AMD rigs.


----------



## b0uncyfr0 (Dec 30, 2022)

AusWolf said:


> I would first apply an auto undervolt, then see how far the core clock goes. The values you asked about are advanced deep-level stuff that's best left for the card/driver to handle.



Thanks. You were right: once I undervolted, that pushed me into the 32k range. The only issues I have atm with Adrenalin are:

It forgets all tuning settings when rebooted - whyyy...
Fan profiles don't work - another bummer.

Little annoyances I hope I can figure out. Adrenalin is definitely better than Afterburner for me atm.



3x0 said:


> @b0uncyfr0 Here are my 3DMark results (from when I had 5600X). 2833/2933 Core, 2250 VRAM, 155W in MPT, 186W max +20% offset, FCLK 2000, 130A GFX (default on my card) and 25A SoC.
> For me MVDD 1400 didn't improve stability for higher VRAM clock. Going above 190W PPT isn't beneficial, could even be detrimental in some temp limited situations.


Oh, interesting - your core clocks are extremely good.

It seems my max core is just under 2900 in Adrenalin. Giving it 1.075V (1.05V is also stable) makes no difference.

This seems to be my current sweet spot; in game, it's hovering just under 2850:
Core Min/Max - 2775/2885
Mem - 2260 (fast timings)
Vcore - 1.05
Power limit +20 & the additional mods from MorePowerTool

Let's see if I can squeeze more out of the core. I'll compare some runs against your scores.


----------



## AusWolf (Dec 30, 2022)

b0uncyfr0 said:


> It forgets all tuning settings when rebooted - whyyy...
> Fan profiles dont work - another bummer.


That's strange. I've never had those before. Which version are you using? What model is your card?


----------



## 3x0 (Dec 30, 2022)

b0uncyfr0 said:


> It forgets all tuning settings when rebooted - whyyy...
> Fan profiles dont work - another bummer.


For me, it only resets the settings if Windows crashed or didn't shut down properly. Check Event Viewer for any shutdown/restart-related issues.

As for fan profiles, my issue was that I couldn't lower the fan speed below 47% (I think) on my Gigabyte Gaming OC Pro. You can try playing around in Fan settings/limits in MorePowerTool.

Regarding my GPU frequency, I haven't undervolted - I kept the stock 1150 mV, which might help you achieve higher clocks if you're not temperature limited.


----------



## Tropick (Dec 30, 2022)

b0uncyfr0 said:


> Thanks. You were right - Once i undervolted, that pushed me into the 32k range. The only issues i have atm with adrenaline are :
> 
> It forgets all tuning settings when rebooted - whyyy...
> Fan profiles dont work - another bummer.
> ...



That's so strange, I was having the exact same issues with my 6950XT for a while until it just suddenly decided to work flawlessly one day. Wouldn't save power target or clock target but it would save my memory speed and voltage settings. Also would sometimes randomly change it to the Rage Mode profile..

Then one day it just decided to start saving everything properly and I haven't had an issue since, have no clue what coaxed it into working though...


----------



## b0uncyfr0 (Dec 30, 2022)

AusWolf said:


> That's strange. I've never had those before. Which version are you using? What model is your card?


The latest Adrenalin - 22.11.2 - Hellhound 6600 XT: https://www.powercolor.com/product?id=1623916068

I'm on the OC BIOS. I doubt it's that, but I'll try the quiet BIOS. Maybe that'll play ball.

Edit: OK, that didn't work, but I figured out the profiles. Seems Adrenalin was adding the fan speeds from the BIOS on top of what was shown. Reducing the limits fixes that.

Edit 2: Fixed the tuning not kicking in; had to remove Afterburner.



3x0 said:


> For me, it only resets the settings if Windows crashed or didn't shutdown properly. Check Event Viewer if there are any shutdown/restart related issues.
> 
> As for fan profiles, my issue was that I couldn't lower the fan speed below 47% (I think) on my Gigabyte Gaming OC Pro. You can try playing around in Fan settings/limits in MorePowerTool.


I don't see any issues in Windows events. I can't lower it past 35%, but that's fine - I'd just like to control it.

I saw on other forums that disabling fast boot might fix it; unfortunately, it didn't do anything for me.


----------



## Nordic (Dec 30, 2022)

I have been really happy with my MSI Mech OC 6750 XT. Some warned me it would be loud because it was the low end version. It is very quiet even when stressed. I don't think the 6750 XT produces enough heat for this cooler.


----------



## b0uncyfr0 (Dec 31, 2022)

Hmm, I'm a lil confused - Adrenalin has `1.015` locked into the (global) profile, and it's applied. And yet, when I'm playing Zero Dawn, I'm seeing the core voltage go as high as 1.085 (according to HWiNFO).

Am I missing something here?


----------



## 3x0 (Dec 31, 2022)

Voltage in the driver is only a "target" and not a hard limit. Specify your voltage in MPT if you want to hard limit it to an upper value.


----------



## AusWolf (Dec 31, 2022)

b0uncyfr0 said:


> The latest adrenaline - 22.11.2 -  Hellhound 6600xt : https://www.powercolor.com/product?id=1623916068
> 
> Im on the OC bios - i doubt its that, but ill try the quiet bios. Maybe that'll play ball.
> 
> ...


I'm glad you solved it. 

*A message for everyone:* Guys, never ever use Afterburner, or any other third-party software on Radeon cards! Unlike Nvidia's text editor from Windows 95, the AMD driver has everything you need.


----------



## Lost_Troll (Dec 31, 2022)

Kabouter Plop said:


> If had no crashes for the first 3 days my self either its pretty random you may even get no crashes for a week and then start crashing dialy suddenly anyway hopefully you wont run into that... anyway chances are AMD will list known issue since its common issue, reports if seen are often on full AMD rig.



I found this channel (Ancient Gameplays) on YouTube, and he seems to have more on how to fix issues with AMD drivers than with any other brand. Here is his latest video with 15 tips for fixing the most common issues, like using AMD's driver uninstall tool before you install newer drivers - I never knew they had their own tool for this.


----------



## Terronium-12 (Jan 4, 2023)

So I've been thinking more and more about delving into MPT, but I don't know... is it worth it?

For example, the highest I can clock my 6900 is 2700/2800 on the core and 2110 on the memory on air. Anything over is unstable; would MPT let me push either further and stay stable? I also don't know the extent of what it can and can't do / what I should and shouldn't mess with. Will I be going down another overclocking rabbit hole?

I don't mind if the answer to that is 'yes', I just want to know what I'm in for. I have a general idea, but I'd love to hear from any of you currently using it!


----------



## puma99dk| (Monday at 7:19 PM)

Lost_Troll said:


> I found this channel (Ancient Gameplays) on YouTube, and he seems to have more on how to fix issues with AMD drivers then with any other brand. Here is his latest video with 15 tips on how to fix the most common issues, like using AMD's driver uninstall tool before you install newer drivers, which I never knew they had their own tool for this.



Don't give Mr Plop any good ideas, so he stops complaining it will ruin his life. He refuses to understand that it might be his OC, card, OS, or maybe another drive that's giving him issues. He needs a life instead of bothering us with these complaints...


----------



## droopyRO (Tuesday at 7:06 PM)

What are your thoughts on this?

It seems very strange that a driver could do that, and why are so many people trying to repair their cards instead of RMAing them?


----------



## Vya Domus (Tuesday at 7:48 PM)

droopyRO said:


> why are so many people trying to repair their cards instead of RMA ?



Bought used/through scalpers so no warranty ?


----------



## Kabouter Plop (Tuesday at 8:07 PM)

droopyRO said:


> What are your thoughts on this ?
> 
> 
> 
> ...



Haha, RIP my card, I guess; its GPU core is cracking cos AMD is pushing it beyond its limits.
I'm having all these driver issues as well.
Black screens are easy to reproduce, especially when resizing Steam on older drivers, or on a WhatsApp video call. The latter I've not seen anyone else reproduce, but I'm 100% sure I could reproduce it on every single AMD GPU - if someone sent me 10 RX 6000 series cards, no problem - so I know this is a driver issue already. Not to mention the proof is already there in being able to produce a black screen by resizing Steam on the Points Shop with older drivers, cos it flickers like crazy.

Damn, I wish I had bought an EVGA GPU instead. They could mess up and put a ticking time bomb in my PC and I would still trust them, cos of their amazing customer support and warranty. Not that I want a ticking time bomb in my PC; I just know they'd do the right thing.
AMD, on the other hand, tries to void the warranty for removing their cooler to fix their problems, or for any overclocking, even though I put a waterblock on my 6900 XT and never overclocked it for 24/7 use.

Not to mention max core frequency limits are ignored:
On the default profile, firing up Doom Eternal, the max core clock is 2555MHz.
Auto OC sets the target at about the same as the max core clock.
Custom GPU tuning suggests 2479MHz, but if I set 2479MHz it still boosts to 2555MHz firing up Doom Eternal.
Then I set 2478MHz instead, fire up Doom Eternal, and it only boosts to 2463MHz. I seriously get the idea that AMD has no fucking clue how to do proper drivers.
Especially considering I've been having issues for months now. It got a lot better, but it's not stable: sometimes I black-screen randomly, or while alt-tabbing, five times a day, either due to a notification or the alt-tab itself.
Meanwhile, on Manjaro Linux I'm fully stable,
and can even run Heaven for 24 hours straight, then Superposition for another 8 hours and Port Royal for another 8 hours.
Maybe I should send my GPU over to this guy for investigation, but I'm not really in the mood to take it out and reassemble it with the stock cooler, especially considering these vapor chambers might have a fatal flaw - maybe it thermally expands and then cracks the die, or something else.
Not to mention AMD gives zero warranty cos of warranty-void stickers, and they ask every chance they get if you are overclocking the card, even though my last 2 cards, both heavily overclocked and watercooled for 5 and 6 years, are still alive.

Anyway, no way I want to be without a GPU. Hopefully in a month and a half I'll have enough for an RTX 4080 with a waterblock preinstalled, cos I'm sure not trusting AMD right now. I wonder how the custom cards will do, though.
It's so awkward to hate AMD, Intel, and Nvidia all at once. I wish AMD was like EVGA and just contacted me, asked for my details, sent me a new card I could put in right away as a replacement, and then had me send my old card as-is to them to investigate what's wrong with it - preferably, again, with a waterblock preinstalled; seriously, I'm tired of disassembling and reassembling stuff. I damaged my GTX 480 like this once - had a cap fall off - but EVGA had no problem fixing it even though I told them it was user damage.

Meh, I hate AMD; I wish they'd just decide to do the right thing. I'm just glad I'm not alone with all the other issues I've been having,
like World of Warcraft freezing the system, with 20+ reports, mostly from AMD systems plus 3 Nvidia systems.

Hopefully AMD releases a new driver really soon fixing all these issues before it becomes a bigger problem. I'll probably keep money in reserve for a future GPU replacement in case they manage to fix this mess before I decide to upgrade again.

Damn, my chest hurts....


----------



## GamerGuy (Tuesday at 8:08 PM)

Vya Domus said:


> Bought used/through scalpers so no warranty ?


Yep, the warranty on my RX 6900 XT just ran out - late December 2022, IIRC. So, yeah, they could be used/abused cards bought through other sources.


----------



## AusWolf (Tuesday at 8:31 PM)

droopyRO said:


> What are your thoughts on this ?
> 
> 
> 
> ...


He said it himself: "we need more information".


----------



## droopyRO (Tuesday at 10:24 PM)

I'm not pointing fingers, just a heads up.


GamerGuy said:


> Yep, the warranty on my RX 6900 XT just ran out, late December 2022 IIRC. So, yeah, could be used abused cards bought through other sources.


In Romania, most manufacturers like MSI, Gigabyte or Asus offer a 3-year warranty, except Sapphire and PowerColor (on some GPUs) for some reason.


Vya Domus said:


> Bought used/through scalpers so no warranty ?


Buying second-hand or from a more dubious source is a possibility, but I think the warranty is transferable in most places.


----------



## GamerGuy (Tuesday at 10:46 PM)

droopyRO said:


> Bought second hand or from a more dubious source are a possibility, but i think the warranty is transferable in most places.


If that is so, why send GPU's to some random shop for repair? Why not RMA?


----------



## Kabouter Plop (Wednesday at 1:05 AM)

My card is still under warranty till May at least, I believe. I hope there's gonna be a huge recall and a bunch of refunds, even for cards that already died, if they weren't given already. But knowing AMD, they'll probably stay silent and not say anything. I wonder if my GPU is at risk since it's watercooled; I wonder if any watercooled cards died.
It's not the first time I've seen drivers kill GPUs: Nvidia had an issue with fan profiles where the fan would just break and stay at 0 RPM, I believe. Hopefully it's an issue like that that caused this; then I don't have to worry about my GPU dying, although it's probably much worse than that.


----------



## AusWolf (Wednesday at 2:15 AM)

droopyRO said:


> I'm not pointing fingers, just a heads up.


I know, thanks for that. The video says it's 6800 and 6900 cards, but I'll be careful with my 6750 XT and monitor it while gaming, just in case. Hopefully, it won't affect me (fingers crossed).

Edit: Also, the video doesn't say if the cards were overclocked or not. Maybe Wattman is broken in the new driver and the cards are fine at stock? (Just speculating)

I've had 22.11.2 installed since it came out. No issues so far.


----------

