
The RX 6000 series Owners' Club

Nickel-plated copper does not have issues with LM; only bare copper and aluminium do. Aluminium has the biggest problem with LM, copper somewhat less, and nickel plating basically none, except maybe some staining, but no corrosion. In the past I also used LM on a copper heatsink; it did some very minor damage, but nothing too bad. On nickel, LM is almost harmless in general.
The effect of liquid metal on copper is merely cosmetic. Functionally, temps should not be affected by years of use.
 
The effect of liquid metal on copper is merely cosmetic. Functionally, temps should not be affected by years of use.
Yup, those were my observations too after using LM on a copper-based AIO for two years.
 
I got the best idea ever from this forum: buying warranty stickers; there is a discussion on the topic here :D . The only thing that would be worrisome for me with the reference cards is that, at least for the 6800XT, they come with a thermal pad, not paste, and that is hardly salvageable; that could be a reason for warranty rejection. But then again, if the stickers are intact, you can make the case that's how you got it. I know, it's far-fetched. Another issue with the reference cards, and it happened with mine: I tried to put paste on it, removed the pad, and had the unpleasant surprise that the cold plate was not very even, so I could not get the hotspot under control anymore. That made me get a waterblock. I was relatively happy with the stock cooler, but I "needed" to tinker with it :laugh:

From looking at your settings, they are not overly aggressive; your problem would be that the reference cooler has its limits, and if it also came with a thermal pad and not paste, that would not allow it much OC room anyway. You could try to replace the paste, but that might not work (it did not work for me), and the cooler itself is perfectly fine for stock settings, but not really for hard OCs.
Appreciate your input. You seem to have tinkered a lot with the card. Do you have any idea why the fan speed never goes up to 100% or 3300 RPM? For me, the max fan speed percentage is 93% in GPU-Z, and the max RPM is around ~3080.
 
@Godhand007, you should try “undervolting” the 6900 XT to get better performance.
 
Appreciate your input. You seem to have tinkered a lot with the card. Do you have any idea why the fan speed never goes up to 100% or 3300 RPM? For me, the max fan speed percentage is 93% in GPU-Z, and the max RPM is around ~3080.
The max is just a theoretical max; sometimes fans can reach it, sometimes not. It's normal with most fans (93%, if accurate, seems on the lower side of things; most fans reach at least 95%). And as @Butanding1987 said, trying to undervolt might help you lower the temps, though it's not a guarantee that it will work. But it does not hurt to try. I would start from 1125 mV and go down in steps of 10.
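That step-down approach can be sketched as a simple loop. This is just an illustration of the advice above; `is_stable` is a hypothetical stand-in for whatever stress test you actually run at each voltage:

```python
def find_lowest_stable_voltage(start_mv=1125, step_mv=10, floor_mv=1000, is_stable=None):
    """Walk the core voltage down in fixed steps until the stress test fails,
    then return the last voltage that passed (or None if none passed)."""
    last_good = None
    mv = start_mv
    while mv >= floor_mv:
        if is_stable(mv):          # run your stress test at this voltage
            last_good = mv
            mv -= step_mv          # passed: try the next step down
        else:
            break                  # failed: the previous voltage was the limit
    return last_good

# Example with a stand-in stability check (pretend the card is stable down to 1085 mV):
print(find_lowest_stable_voltage(is_stable=lambda mv: mv >= 1085))  # → 1085
```

In practice each `is_stable` call is a manual run of a game or benchmark, so the "loop" takes hours, not milliseconds, but the search logic is the same.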
 
I knew you couldn’t resist it. :D
Yes, I know, every time I say I will not do it, but I really cannot resist. :roll:
It is simply the tweaker genes in me that in the end take control and force me to do such things. :D

By the way, I got a refund for the 6800XT that I destroyed earlier with ESD. :D
So the price of the 6900XTU doesn't hurt anymore; recalculating with the money I got back for the 6800XT, I end up paying 1297€ for the 6900XTU Liquid Devil Ultimate.
That is the price a regular 6900XT had a few months ago. Now even a standard 6900XT costs 2000€.

Probably 10°C, but liquid metal can destroy aluminium and potentially damage the GPU if not used correctly. It also conducts electricity, so you don't want it touching parts it shouldn't, although I am not sure about that last one.
If one is careful, then LM is not a big problem.
I have been using it for many years.

As @Felix123BU mentioned, it should not be used with aluminium, as that will be destroyed. On copper there is a chemical reaction that makes the gallium inside liquid metal combine with the copper, permanently causing discolouration, but the heatsink is not destroyed.
With nickel, there is some slight staining after a while, but it does not cause any major issues.

Regarding application, I filled the area around the GPU die with MX-5, covering any exposed components. MX-5 is not conductive and therefore insulates the components in case any LM runs off.
It is also easier to remove than, for example, nail polish.

If one is careful with the quantity of LM, then it does not run off. You need a very, very tiny amount, which is more than enough to cover the whole GPU die.
I used Thermal Grizzly Conductonaut, which comes with a cotton-swab-like tool to spread the LM onto the die; if after applying you still have liquid floating around on the die, then you applied too much.
Too much is a problem, as it will run off and spread around, covering the area around the die.
 
@Godhand007, you should try “undervolting” the 6900 XT to get better performance.
Unfortunately, current clocks are not stable without 1175 mV.

The max is just a theoretical max; sometimes fans can reach it, sometimes not. It's normal with most fans (93%, if accurate, seems on the lower side of things; most fans reach at least 95%). And as @Butanding1987 said, trying to undervolt might help you lower the temps, though it's not a guarantee that it will work. But it does not hurt to try. I would start from 1125 mV and go down in steps of 10.
This is the first time I have encountered this issue. Fans on a multitude of GPUs/CPUs have always reached the max for me. Someone suggested that this might be a quirk of RX 6900 XT reference cards, but it's difficult to say what is true.
 

Unfortunately, current clocks are not stable without 1175 mV.
Yes, the 6900XT doesn't like that much undervolting.
That's even the case for my XTXH card.
My 6900XTU can do 1200 mV by default. 1175 mV is the voltage I have been using for 2750 MHz.

I have now got it down to 1085 mV by tightening the lower frequency limit, but I still need to run a stress test to see if that is always stable. I am not too confident, as my 6900XT likes higher voltages.

With my 6800XT before, I could set 2750 MHz @ 1020 mV. I still feel very sad when I remember that my super 6800XT died. :cry:

I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt

Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.

Anyone want to share some experiences?
Welcome to the club. :cool:
The 6800XT is a nice card.
Interesting that you went from a 3090 to a 6800XT. :cool:
You are lucky that you could grab one directly from AMD. :rolleyes:
That's the only place where you can get one at MSRP.

I tried twice myself to get a new one from there after my 6800XT died from an ESD shock, but it was almost impossible, as they were sold out each time within seconds while I was still trying to pay with PayPal. :laugh:
 
Yes, the 6900XT doesn't like that much undervolting.
That's even the case for my XTXH card.
My 6900XTU can do 1200 mV by default. 1175 mV is the voltage I have been using for 2750 MHz.

I have now got it down to 1085 mV by tightening the lower frequency limit, but I still need to run a stress test to see if that is always stable. I am not too confident, as my 6900XT likes higher voltages.

With my 6800XT before, I could set 2750 MHz @ 1020 mV. I still feel very sad when I remember that my super 6800XT died. :cry:
I guess I have to be OK with sustained 2600 MHz+ frequencies along with a full memory OC. I suspect I could have gotten 60-70 MHz more if I were on water, but for the price (got it at MSRP), it's more than enough performance. If RT performance were around 20-30% better (I had a 2080 Ti before this), then this would have been the perfect card for me. I just hope that those clocks are stable and I don't have to lower them any further.

BTW, you got very lucky that you got a refund for a tinkered-with RX 6800 XT.

I just became the owner of a https://www.amd.com/en/products/graphics/amd-radeon-rx-6800-xt

Am I in for a thrill? I sold my RTX 3090; this card was $1200 cheaper at MSRP, so I went for it.

Anyone want to share some experiences?
Just be sure to set your expectations reasonably (it's not the best of the best of the best card) and it should provide you more than enough performance to enjoy the latest titles with lowered RT settings.
 
So, any thoughts on the new Unreal Engine 5 demo? I ran it on my system and was not impressed. Had to delete it, as it, along with other Unreal Engine 5 stuff, was taking around 200 GB of prime SSD space.
 
Just be sure to set your expectations reasonably (it's not the best of the best of the best card) and it should provide you more than enough performance to enjoy the latest titles with lowered RT settings.

I actually think it was the best of the best of the best. :) 3090 -> 6900XT -> 6800XT. When an RTX 3080 Ti hits the market, Nvidia might be relevant again.
 
I actually think it was the best of the best of the best. :) 3090 -> 6900XT -> 6800XT. When an RTX 3080 Ti hits the market, Nvidia might be relevant again.
The 6900XT is the best card for 1440p raster. The 3090 is the best overall.

Both companies are relevant these days because of the GPU supply shortage. Prices for 3080 Ti models are going to be higher than the reference price of a 3090. Unless I am missing something, 3080 Ti custom models will be beaten by 6900XT custom models, with the exception of RT benchmarks. So it is going to be a dud from a normal consumer's perspective.
 
Haven't really tried pushing my 6900 XT yet; it boosts to 2.5 GHz out of the box, 2.555 with the Rage profile. I'm still at 1080p until late June, when I get my second shot and enough money to buy a Samsung Odyssey G9 :)
 
Haven't really tried pushing my 6900 XT yet; it boosts to 2.5 GHz out of the box, 2.555 with the Rage profile. I'm still at 1080p until late June, when I get my second shot and enough money to buy a Samsung Odyssey G9 :)
Which custom model do you have, and do those clocks sustain in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.
 
Which custom model do you have, and do those clocks sustain in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.
Not really. I play at 3440x1440, which in terms of pixel count is between 1440p and 4K, and I hold the exact same clocks regardless of resolution on my 6800XT. Granted, it's on water, but it could hold almost the same clocks with the stock reference cooler, just at higher temps.
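The "between 1440p and 4K" claim is easy to check with pixel counts (pure arithmetic, no assumptions):

```python
# Total pixels per frame at each resolution
qhd   = 2560 * 1440   # 16:9 1440p: 3,686,400 pixels
uwqhd = 3440 * 1440   # ultrawide 1440p: 4,953,600 pixels
uhd   = 3840 * 2160   # 4K UHD: 8,294,400 pixels

assert qhd < uwqhd < uhd  # ultrawide sits between the two
print(f"3440x1440 renders {uwqhd / qhd:.0%} of the pixels of 1440p "
      f"and {uwqhd / uhd:.0%} of 4K")
```

So the GPU load at 3440x1440 is about a third heavier than at regular 1440p, but only about 60% of a full 4K frame.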
 
As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for my 3090, I must say that the 6800 XT is the better card, to be honest. Ray tracing etc. needs time to mature, I believe, and all the important titles like Cyberpunk 2077 I have already completed. Now I play mostly AC Valhalla and games like Forza Horizon 4, Watch Dogs Legion, etc., and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.

I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and quite a few reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.

The only real thing I can complain about is the 40+ MH/s lost on Eth mining. That is a bit annoying.
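The "roughly 30% for free" figure for rendering at 1800p instead of native 4K is in the right ballpark given the pixel counts. A rough sketch, assuming 16:9 resolutions and that fps scales linearly with pixel count (it rarely does exactly):

```python
p1800 = 3200 * 1800   # 1800p: 5,760,000 pixels
p2160 = 3840 * 2160   # 2160p (4K): 8,294,400 pixels

# Under the linear-scaling assumption, the fps gain from dropping to 1800p is:
speedup = p2160 / p1800 - 1
print(f"~{speedup:.0%} more fps at 1800p than at native 4K")  # ~44%
```

Real-world gains are usually somewhat lower than the pixel ratio, since not all frame time is resolution-bound, which is consistent with the ~30% quoted above.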
 
Which custom model do you have, and do those clocks sustain in-game? Also, when you move to 1440p or higher, you will see some reduction in those boost clocks.

It's just an AMD reference card, directly from amd.com.
 
As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for my 3090, I must say that the 6800 XT is the better card, to be honest. Ray tracing etc. needs time to mature, I believe, and all the important titles like Cyberpunk 2077 I have already completed. Now I play mostly AC Valhalla and games like Forza Horizon 4, Watch Dogs Legion, etc., and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.

I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and quite a few reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.

The only real thing I can complain about is the 40+ MH/s lost on Eth mining. That is a bit annoying.
:laugh: Yeah, the 6800XT is not as good for mining, but speaking from my own experience, I get 62 MH/s at 130 W; it's quite power efficient, even though not as much of a hash monster as Ampere.
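The power-efficiency point is easy to quantify from the quoted figures (simple arithmetic on 62 MH/s at 130 W):

```python
hashrate_mhs = 62     # quoted Ethash hashrate, MH/s
power_w      = 130    # quoted board power, W

efficiency = hashrate_mhs / power_w
print(f"{efficiency:.2f} MH/s per watt")  # → 0.48 MH/s per watt
```

Hashrate per watt is the number that actually determines mining profitability once electricity costs are factored in, which is why a lower absolute hashrate can still be competitive.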
 
Not really. I play at 3440x1440, which in terms of pixel count is between 1440p and 4K, and I hold the exact same clocks regardless of resolution on my 6800XT. Granted, it's on water, but it could hold almost the same clocks with the stock reference cooler, just at higher temps.
I have to disagree on that. As resolution increases, clocks go down for me, not by a huge margin, but certainly not insignificantly. Temps remain mostly the same. Also, I believe this is the case for almost all modern cards, i.e. clocks go down at higher resolutions. That waterblock might be the thing making the difference in your case, though.

As mentioned, I sold my 3090. Since I got the 6800 XT at MSRP, $1200 less than what I paid for my 3090, I must say that the 6800 XT is the better card, to be honest. Ray tracing etc. needs time to mature, I believe, and all the important titles like Cyberpunk 2077 I have already completed. Now I play mostly AC Valhalla and games like Forza Horizon 4, Watch Dogs Legion, etc., and those games excel on the AMD card. However, if I could get a good price and had the money for it, I might snag a 3080 Ti. I think the 3080's VRAM amount is going to be an issue later on.

I play on the Odyssey G9 at 5120x1440. I have high expectations for the following: a downscaled 1800p image can look as good as 4K, giving roughly a 30% boost for free, and quite a few reviews actually say it's better than DLSS. As my card is an AMD Radeon RX 6800 XT Black Edition, I'm lucky that Trixx works, and that should be even better than AMD's Image Sharpening alone.

The only real thing I can complain about is the 40+ MH/s lost on Eth mining. That is a bit annoying.
Well, if you are happy with your card, then that is one of the most important things checked already. Though I must say that one can always justify their purchase based on the games that they play, but overall the 3090 is the better card (though twice as costly as well). I bought an RX 6900 XT because I wanted the best performance at 1440p. Nvidia was not a clear winner there, and with the OC potential of the RX 6900 XT, I knew I made the right decision, or at least the optimal one. The lack of RT performance does hurt a bit, though.

BTW, is Watch Dogs Legion really performing better on AMD hardware? I presume you are talking about performance with RT off.
 
I have to disagree on that. As resolution increases, clocks go down for me, not by a huge margin, but certainly not insignificantly. Temps remain mostly the same. Also, I believe this is the case for almost all modern cards, i.e. clocks go down at higher resolutions. That waterblock might be the thing making the difference in your case, though.


Well, if you are happy with your card, then that is one of the most important things checked already. Though I must say that one can always justify their purchase based on the games that they play, but overall the 3090 is the better card (though twice as costly as well). I bought an RX 6900 XT because I wanted the best performance at 1440p. Nvidia was not a clear winner there, and with the OC potential of the RX 6900 XT, I knew I made the right decision, or at least the optimal one. The lack of RT performance does hurt a bit, though.

BTW, is Watch Dogs Legion really performing better on AMD hardware? I presume you are talking about performance with RT off.

Happy and happy... you know, those are big words. I wasn't 100 percent happy with the performance of the 3090, even overclocked, and I think that image scaling with Trixx https://www.sapphiretech.com/en/software plus AMD's Image Sharpening will make up for the fps I lose in some games. I don't know exactly how it performs at the same settings as the 3090; I will receive the card tomorrow. But I bought the card because I have some expectations of AMD's technology advantages in some situations. And also, I really didn't need 24 GB of VRAM yet. I think 16 GB makes sense, as many games I played on the 3090 stored more than 10 GB in the buffer. Other games like AC Valhalla sit around 9700 MB-ish, so it's very close to the 10 GB "limit" of a 3080. Also, the scalper prices for NVIDIA cards are ridiculous, and getting a card at MSRP these days is simply luck :)
 
Happy and happy... you know, those are big words. I wasn't 100 percent happy with the performance of the 3090, even overclocked, and I think that image scaling with Trixx https://www.sapphiretech.com/en/software plus AMD's Image Sharpening will make up for the fps I lose in some games. I don't know exactly how it performs at the same settings as the 3090; I will receive the card tomorrow. But I bought the card because I have some expectations of AMD's technology advantages in some situations. And also, I really didn't need 24 GB of VRAM yet. I think 16 GB makes sense, as many games I played on the 3090 stored more than 10 GB in the buffer. Other games like AC Valhalla sit around 9700 MB-ish, so it's very close to the 10 GB "limit" of a 3080. Also, the scalper prices for NVIDIA cards are ridiculous, and getting a card at MSRP these days is simply luck :)
What were you unhappy with (on the OC'd 3090)? Can you give some specific examples?

Sharpening tech is available on Nvidia as well. Also, 24 GB of VRAM is not why gamers buy the RTX 3090; it's for the best 4K performance.

"I have some expectations of AMD's technology advantages in some situations": that is one of the reasons I had for choosing AMD as well. When the performance between the two companies is so close, even minor advantages can become meaningful. Take the example of SAM vs. Resizable BAR: AMD benefits more from SAM than Nvidia does from Resizable BAR.
 
What were you unhappy with (on the OC'd 3090)? Can you give some specific examples?

Sharpening tech is available on Nvidia as well. Also, 24 GB of VRAM is not why gamers buy the RTX 3090; it's for the best 4K performance.

"I have some expectations of AMD's technology advantages in some situations": that is one of the reasons I had for choosing AMD as well. When the performance between the two companies is so close, even minor advantages can become meaningful. Take the example of SAM vs. Resizable BAR: AMD benefits more from SAM than Nvidia does from Resizable BAR.
Yeah, that is what I was not happy with. In AC Valhalla, the fps before Resizable BAR could dip below 60 for a second or two, messing up G-Sync on the Odyssey G9. Since it also has FreeSync Premium Pro, I hope it will be a good experience.
 
Just watched AMD's keynote. FSR looks quite good; let's see how and when it gets implemented in actual games. I hope Cyberpunk will be among the first ones :)
 