# AMD RX 7000 series GPU Owners' Club



## GamerGuy (Dec 15, 2022)

Okay guys, gonna start a similar thread to the RX 6000 owners' club. Pretty sure I ain't the first here, not even with an AIB card. I did say that when I got an RX 7900 card, it'd be from Sapphire, PowerColor, XFX or perhaps ASRock. So, here it is, an XFX MERC 310 RX 7900 XTX. Pardon the crappy pics, my friend took them for me as I'm still overseas and he had to pick it up for me yesterday back home.










Here's a pic of it showing the 3x 8-pin PCIe power connectors. One thing I know for sure, I don't have to worry about them melting. Oh yeah, that's not my friend holding it; those grubby hands molesting my card belong to the staff of the shop.




I'll only be back on the 2nd week of February.....so why did I get it now? I blame the net and online purchases, makes it too easy and convenient to do crazy stuff like this.


----------



## shovenose (Dec 15, 2022)

Good afternoon folks, my XFX AMD Reference Edition RX 7900 XT is here, and after trying it out, I'm very happy with my purchase.

Microsoft Flight Simulator 2020 is nice and playable at Ultra but has an occasional dip, perhaps due to the GPU, perhaps due to my CPU, not sure. I put it down to High and it's very smooth that way, and still such a gorgeous game.

Other games I tried so far are GTA V, Stray, and BeamNG Drive. All maxed out, all perfect.

Overall, if you want a card for 4K60 gaming, I recommend it for sure!

The reference design feels nicely built, and the fans are pretty quiet even in MSFS 2020, although my case, an Asus AP201, is pretty much mesh all the way around so it's not baking in there; your experience could be different. Coil whine under load is present but minimal, and enabling V-Sync cuts it down drastically. I always use V-Sync anyway, so no complaints there. I also had slight coil whine with my previous card, the RX 6600, so it could be the PSU, which is not the best in the world.

Overall I give it 4/5 stars. The XTX is simply the better value in performance per dollar, which is why I'm docking a star.


----------



## Kabouter Plop (Dec 19, 2022)

Where are the watercooled custom cards though? Preferably with a waterblock preinstalled from EKWB.


----------



## GamerGuy (Dec 19, 2022)

I'd love to see someone come in with the  *ASRock Radeon RX 7900 XTX Aqua*, that's one secksay, and expensive!!!, beast.


----------



## Kabouter Plop (Dec 21, 2022)

GamerGuy said:


> I'd love to see someone come in with the  *ASRock Radeon RX 7900 XTX Aqua*, that's one secksay, and expensive!!!, beast.



If there are no other watercooled cards and I am upgrading, it'd be my first choice, but only if AMD is being honest about known issues.



This is an issue I've had since May 2021, and oh boy did I nag AMD a lot. Now if only they can be honest about other issues, then I can at least trust that AMD is trying to fix problems instead of shoving them under a rug.
Btw, I switched to AMD because of driver issues with Nvidia: flickering trees and such that they'd been putting a band-aid on since BfA, and which have now apparently made a return once again. I hate driver hell. Hopefully they start listing way more issues in the next driver. I don't care much that there's an issue; what I care about is that they make everyone aware of it, else how can I even trust them?


----------



## Theswweet (Dec 24, 2022)

I'm away from my desktop for the holidays, but so far I've been happy with my 7900 XTX. Playing some of the games in my backlog, like GotG with the FSR2 mod, getting 100+ FPS on max settings with RT. Hopefully driver updates in the new year will improve things even further.


----------



## Kabouter Plop (Dec 30, 2022)

Make sure your DisplayPort cables are OK btw, guys

https://www.reddit.com/r/Amd/comments/zxcjji


----------



## Theswweet (Jan 1, 2023)

Der8auer is convinced that the issue is faulty vapor chambers. He rigorously tested the issue and can't find any other explanation...


----------



## gffermari (Jan 1, 2023)

The ''poor'' performance can be fixed with a price change, but not this.
We have to see if it's happening in most of the cards that have been sold.


----------



## Kamildos (Jan 1, 2023)

Got a Gigabyte 7900 XT OC and I'm disappointed, should have stayed with my Toxic 6900... Having trouble playing RL at more than 500 fps with the GPU at 100%, this thing is running hot, and Radeon Software constantly times out. So now I'm running just the driver, and not a single crash since then. And yes, DDU and all that stuff has been done. I'm tired of this *new* trend... let's sell unfinished products. Companies don't give a *** anymore, because everybody keeps buying unfinished products.


----------



## Vya Domus (Jan 1, 2023)

Reference Sapphire 7900XT here.



Theswweet said:


> Der8auer is convinced that the issue is faulty vapor chambers. He rigorously tested the issue and can't find any other explanation...


It makes no sense to me why this issue is not also encountered on the 7900 XT. On my reference card I don't think I've ever seen anything above 85C hotspot, while pulling upwards of 450W at times no less, and there is never a difference between GPU temp and hotspot greater than 15C. I find it hard to believe a couple of extra CUs can make that much of a difference, and the coolers don't differ that much.


----------



## Jaffakeik (Jan 1, 2023)

Very happy with my card, it keeps under 80C during 4K ultra gaming at 144 Hz with the default profile.


----------



## ratirt (Jan 1, 2023)

Vya Domus said:


> Reference Sapphire 7900XT here.
> 
> 
> It makes no sense to me why this issue is not encountered also on the 7900XT, on my reference card I don't think I've even saw anything above 85C hotspot temperature pulling upwards of 450W> at times no less, there is never a difference between GPU temp and hotspot greater than 15C, I find it hard to believe a couple of extra CUs can make that much of a difference and the coolers don't differ that much.


How much did you pay for the card where you live? I'm looking at the prices in Norway and it's looking kinda grim.
Although there weren't a lot of cards at release, so maybe prices will drop after a while.

About the XTX and the hotspot temp: the XTX is a bigger card, so shouldn't its cooler compensate for the higher heat compared to the 7900 XT?


----------



## LifeOnMars (Jan 1, 2023)

Jaffakeik said:


> very happy with my card it keeps under 80c on 4k ultra gaming 144 with default profile.


I know you're gaming at 4K but realistically, how well does the 2700X hold up in pushing the card, even at that res? I know it depends on the game (being a 4K gamer myself) but some games even at 4K still need requisite CPU grunt.


----------



## Jaffakeik (Jan 1, 2023)

LifeOnMars said:


> I know you're gaming at 4K but realistically, how well does the 2700X hold up in pushing the card, even at that res? I know it depends on the game (being a 4K gamer myself) but some games even at 4K still need requisite CPU grunt.


What do you mean by 2700X? I'm a bit confused here. You mean like a Ryzen CPU? I'm on an Intel i9-12900K.

My specs probably haven't been updated since 2016.


----------



## AsRock (Jan 1, 2023)

Jaffakeik said:


> what do u mean by 2700x? I'm bit confused here. U mean like ryzen cpu? I'm on intel i9 12900K
> 
> my specs not updated since 2016 probably



Your Specs sound like they need updating.


----------



## Jaffakeik (Jan 1, 2023)

AsRock said:


> Your Specs sound like they need updating.


I updated them just now


----------



## kapone32 (Jan 1, 2023)

So my friend let me know that I can get a 7900 XTX, and I am getting the Sapphire Pulse 7900 XTX. I do have a question though. I am looking at a waterblock for the card, as the stock cooler is too big. On the Alphacool website they have Sapphire blocks for the Nitro+ and Toxic cards, but no Toxic card has been released. I've read the manuals and looked at as many images as possible, and I do believe the block for the Toxic is indeed the block that will work on the Pulse card. I just don't want to spend $180 for nothing, so I'd like to know if this can be verified.


----------



## LifeOnMars (Jan 1, 2023)

Jaffakeik said:


> what do u mean by 2700x? I'm bit confused here. U mean like ryzen cpu? I'm on intel i9 12900K
> 
> my specs not updated since 2016 probably


Yeah, sorry, I only had your system specs on the left to go by. The 12900K is more than enough. Enjoy!


----------



## Vya Domus (Jan 1, 2023)

ratirt said:


> How much did you pay for the card where you live? I'm looking at the prices in Norway and it is looking kinda grim.
> Although, there wasn't a lot of cards at the release so maybe it will drop after a while.



It is grim, about 1000 euro, and the XTX was almost 1300. I don't think they'll drop; this is pretty much par for the course where I live, everything is about 20% over USD MSRP.


ratirt said:


> With the XTX and the hot spot temp, the XTX is a bigger card so shouldn't that compensate for the higher heat than the 7900xt?



That does not explain a gap of more than 30C between GPU temp and hotspot. If something were wrong with the cooler, both temps should be significantly worse, not just the hotspot.


----------



## Kabouter Plop (Jan 1, 2023)

There was a case of one user having high hotspot temps due to the DisplayPort cable used, with an SSD not being detected with the bad cable as well. Anyway, looks like a recall may be needed.


----------



## GamerGuy (Jan 1, 2023)

Awesome to see this thread getting some traction. I remember for the RX 6000 thread I had to cajole users for responses....but heck, I'd even forgotten about the thread myself initially.  

Now, I can't wait to get back to build my 2nd rig, and to install my MERC 310 RX 7900 XTX into my main rig.....fun time ahead!


----------



## GamerGuy (Jan 4, 2023)

Anyone seen this yet? Opinion?


----------



## Vya Domus (Jan 4, 2023)

GamerGuy said:


> Anyone seen this yet? Opinion?



As bizarre as that sounds, it seems much more plausible to me.

As I've pointed out many times, a defective vapor chamber would mean higher temperatures across the board. Maybe whatever display controller circuitry there is on the chip overheats when one of those cables is connected.


----------



## 3x0 (Jan 4, 2023)

Vya Domus said:


> As bizzare as that sounds it seems much more plausible to me.
> 
> As I pointed out many times a defective vapor chamber would mean higher temperatures across the board. Maybe whatever display controller circuitry on the chip there is overheats when one of those cables are connected.


Then how would changing the orientation of the GPU fix or break the issue if it's a faulty cable?


----------



## Vya Domus (Jan 4, 2023)

3x0 said:


> Then how would changing the orientation of the GPU fix or break the issue if it's a faulty cable?



No idea. I also don't understand why putting the GPU back doesn't cause the temperature to go back as well. But it seems increasingly clear to me that the cooler is not at fault here.

Maybe the cable is somewhat loose, so when the orientation changes some pins don't make proper contact; I don't know. It's strange either way.


----------



## 3x0 (Jan 4, 2023)

What seems increasingly clear to me is that the DP cable and vapor chamber are two different issues resulting in the same outcome.


----------



## tabascosauz (Jan 4, 2023)

Vya Domus said:


> No idea, I also don't understand why putting the GPU back doesn't cause the temperature to also go back. But it seems increasingly clear to me that it's not the cooler at fault here.
> 
> Maybe the cable is somewhat loose so when the orientation changes some pins don't make proper contact, I don't know. It's strange either way.



How's the MBA cooler on your XT? No problems? It's a bit late for me to have second thoughts about it, but I can't find any word on whether the XT suffers from the same 110C problem, maybe due to how few people have the XT.

Vertical mount in my case, but I don't want the problem to come back and bite me in the ass once warranty is over, just because I didn't go out of my way to test it in horizontal orientation.


----------



## Theswweet (Jan 5, 2023)

So... turns out that enabling anti-lag stopped my crashes/driver timeouts with Witcher 3. How? I don't know, but I can enjoy the game now. Yay?


----------



## BATOFF3 (Jan 5, 2023)

Sapphire MBA 7900 XT reference owner here. No issues with the cooler. In a 22C air-conditioned room I haven't seen it break 60C on the hotspot.
At idle it does 32C GPU, 35C junction.
Happy with this purchase, tho my wallet wasn't. lol
Have had it for a week, no issues.


----------



## tabascosauz (Jan 5, 2023)

BATOFF3 said:


> Happy with this purchase, tho My wallet wasnt. lol



I have a feeling this will also be me in 2 weeks   

Single monitor though? I think I will be running into the multi-monitor power issue.......can't wait to see 100W GPU idle


----------



## BATOFF3 (Jan 5, 2023)

Single monitor... I don't have the space for any more on my desk, plus I gave up using Eyefinity back around 2009 when I ran two HD 4870s in CrossFire. lol


----------



## AusWolf (Jan 5, 2023)

I've got a favour to ask.

Have any of you got an MBA card with the heat issue described in the article posted a couple of days ago? If so, can you run a 3DMark Time Spy or Time Spy Extreme stress test and post the end result? YouTube videos seem to claim that the card is throttling when it hits 110 °C on the hotspot, without explaining what they mean by "throttling". As we know, AMD cards don't have predetermined boost bins, so the term "throttling" means nothing to me here. I'm particularly interested in the end score, and the difference between the best and worst runs.

I'm not looking for a fight. I just want to understand the situation better.

Thanks.


----------



## BATOFF3 (Jan 5, 2023)

Hope this helps.


----------



## tabascosauz (Jan 5, 2023)

BATOFF3 said:


> Single Monitor...dont have the space for anymore on My desk, plus I gave up using eyefinity back around 2009 when i ran 2 HD4870's in crossfire. lol



Nah, not Eyefinity, just a main 32" 165Hz panel and a 27" 165Hz vertical panel to the side. The 3070 Ti handles both at 20W but the 2060S needed 45W; I'm bracing for the worst unless the promised Adrenalin multi-monitor fix driver is already out.

You undervolted? My ambient is also around 22C, we can compare when the time comes. I know hotspot is kind of a wild shitshow on all cards.


----------



## BATOFF3 (Jan 5, 2023)

No undervolt, everything at stock. The hotspot did break 60C in that Time Spy test.
Nothing to worry about in my case.


----------



## AsRock (Jan 5, 2023)

Vya Domus said:


> It is grim, about 1000 euro and the XTX was almost 1300. I don't think they'll drop, this is pretty much part of the course where I live, everything is about 20% over USD MSRP.
> 
> 
> That does not explain a gap of more than 30C between GPU temp and hotspot. If we're assuming something is wrong with the cooler then both temps should be significantly worse not just the hot spot.



Well, US prices were to go up by another 20% too, but I believe it's been pushed back another 9 months.


----------



## AusWolf (Jan 5, 2023)

BATOFF3 said:


> Hope this helps.


Thanks for that, but I was actually looking for a stress test result with a faulty card, something like this:











This is my 6750 XT at full stock settings: 85 °C max GPU temp and 100-105 °C hotspot. Since the videos talk about some kind of throttling (a dubious term on a modern AMD GPU, where clock speeds always fluctuate, even in normal circumstances), I think the only way to prove or disprove a decrease in performance in relation to heat is a stability test, which is surprisingly missing from every report.
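For what it's worth, 3DMark's stress tests already boil this down to a single number: frame rate stability, the worst loop's average FPS as a percentage of the best loop's, with (as I understand it) 97% as the pass threshold. A toy sketch of that calculation (function name and sample data are mine, just for illustration):

```python
def frame_rate_stability(loop_fps):
    """Worst loop's average FPS as a percentage of the best loop's."""
    return min(loop_fps) / max(loop_fps) * 100

# Hypothetical 5-loop run: a card throttling over time would show a
# falling trend and a stability score below the 97% pass mark.
loops = [98.0, 97.5, 96.8, 94.2, 93.1]
print(f"{frame_rate_stability(loops):.1f}%")  # 95.0%
```

A healthy card's loops should all land within a couple of percent of each other, which is exactly why the end scores of the best and worst runs are the interesting data point here.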


----------



## Vya Domus (Jan 5, 2023)

tabascosauz said:


> How's the MBA cooler on your XT? No problems? It's a bit late for me to have second thoughts about it, but I can't find any word on whether the XT suffers from the same 110C problem, maybe due to how few people have the XT.
> 
> Vertical mount in my case, but I don't want the problem to come back and bite me in the ass once warranty is over, just because I didn't go out of my way to test it in horizontal orientation.



Max 85C hotspot while overclocked.


----------



## Theswweet (Jan 5, 2023)

I downloaded Ghostrunner and fired it up on my 7900 XTX using the FSR 2.1 mod. Man, this thing rips at "DLSS" Quality and looks slick as hell: 70+ FPS in Cybervoid sections and over 100 FPS consistently in regular stages, completely maxed out at 4K. RDNA3's ray tracing might not be as good as Nvidia's, but it certainly manages well enough, a clear step up from RDNA2...


----------



## kurosagi01 (Jan 5, 2023)

My 7900 XT MBA junction temps have been fine so far; with my current undervolt setup I'm getting 60-65C while playing games.
At default settings I've been getting 75C junction temp, which is in the ballpark of W1zz's review. My room temperature at the moment is around 18-22C on a good day, since it's cold in the UK right now and I don't have the heaters on.
I'll just link my photo of the card: https://www.techpowerup.com/forums/threads/whats-your-latest-tech-purchase.225885/post-4914562


----------



## tabascosauz (Jan 5, 2023)

kurosagi01 said:


> My 7900XT MBA junction temps have been fine so far, with my current undervolt setup i'm getting 60-65c while playing games.
> The default setting i've been getting is 75c in the junction temp, which is around the ballpark from W1zz review. My room temperature at the moment is around 18-22c on a good day,since its cold in the UK at the moment and I don't have the heaters on.
> I'll just link my photo of the card: https://www.techpowerup.com/forums/threads/whats-your-latest-tech-purchase.225885/post-4914562



Nice setup. How low can you go on vcore and still match stock clocks? I need to read up on some RDNA undervolting guides.

I like the fan stop. I was getting conflicting accounts of the MBA cooler (W1zz said fan stop, Linus said no fan stop).


----------



## kurosagi01 (Jan 5, 2023)

I'm not sure whether the fan stop actually kicks in for me, since I don't really look, and all the fans in general kind of overpower the GPU noise.
As for seeing how low I can undervolt the vcore at the stock target boost clock, I've not really tried.
The way I set the target core clock to 2760 MHz is based on the auto value entered when I adjusted the power limit to +15% (the core clock value automatically changed to 2800 MHz when I increased the power limit).
So I did 2400 (default boost clock) * 1.15 (15% power limit increase), which gives 2760 MHz, and set that as the target instead.
I've only gone by observation for stability: on COD it was perfectly fine at 1050 mV with the 15% power limit and the auto core clock value, but NFS Unbound hard-crashed my PC, so I adjusted it to 1075 mV and it has been rock solid so far.
You may be able to get a much lower undervolt on the vcore if you target the 2400 MHz boost clock with the 15% power limit increase, maybe even an 11-15C junction temp drop too with a custom fan profile.
I know a lot of the tech tubers like to set fan speed at 100%, but nobody likes listening to a jet plane all the time.
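Just to spell out the arithmetic in that post: the target clock is the default boost clock scaled by the power limit increase. A quick sketch (the helper name is mine, not anything from Adrenalin):

```python
def target_clock_mhz(stock_boost_mhz, power_limit_pct):
    """Scale the default boost clock by the power limit increase."""
    return round(stock_boost_mhz * (1 + power_limit_pct / 100))

# 2400 MHz default boost with a +15% power limit
print(target_clock_mhz(2400, 15))  # 2760
```

The same formula would give 2640 MHz at +10%, if anyone wants to try a more conservative power limit.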


----------



## GamerGuy (Jan 5, 2023)

@kurosagi01 Are you using a no name DP cable? Or, is it a branded one?


----------



## kurosagi01 (Jan 5, 2023)

GamerGuy said:


> @kurosagi01 Are you using a no name DP cable? Or, is it a branded one?


I am using the one that was provided with my Iiyama monitor, which should be better than a no-name/generic brand from eBay or Amazon.


----------



## GamerGuy (Jan 5, 2023)

kurosagi01 said:


> I am using one that was provided with my IIyama monitor, which should be better than a "no-name/generic" brand from ebay or amazon.


Ah, yep, Iiyama makes professional-grade, color-accurate monitors, right? Little wonder they supplied you a good quality cable. I'm a firm believer in good quality DP/HDMI cables....I mean, if you're willing to spend all that cash on high-end cards and great monitors, why cheap out on cables?


----------



## kapone32 (Jan 6, 2023)

Well, after about 3 hours of messing around I am ready to give my opinion on the 7900 XTX. There was a song from back in the day called "Don't Believe the Hype". What AMD has achieved is exactly what you thought they could if you owned a 6500 XT. I don't even have mine running at full bandwidth. I will say that in just about every game so far my X3D is running at or above 4 GHz at all times now. In TWWH3 the FPS never drops below 100, ever. The easiest way to improve performance (if you have a 3x8-pin) is to turn up the power slider in the Adrenalin software. Using Auto OC I see 3.1 GHz on the core clock. I may have to update HWiNFO64 to get some more variables to show up. 







This is after a 1-hour battle in TWWH3 where my 30 units at 100 minimum were fighting 4 stacks of Dwarves and Kislev combined.




Now here are my thoughts on the issues. As to the 110C hotspot, I really feel it is the thermal pads on the memory. Even on my card the memory junction temp is still 15 degrees higher, even 10-15 minutes after pushing the card. Before I updated HWiNFO64 I was only getting the GPU and hotspot temps, but since updating I've noticed that my memory junction temp is higher than my hotspot temp by 3 degrees. When I get my block I will let you guys know how adding water changes the behaviour of these cards. It might just be a case of improper materials used, but this is PC, so we can't really say. 

The hot cord issue: I took the first DP cable I ever owned, which I believe is DP 1.2. After 15 minutes of running 3DMark I noticed that the cable was very warm to the touch. Then I used my DP 1.4 cable and the issue went away. These cards are so fast that your poor DP cable had better be high bandwidth. 

Vs the 6000 series: it still feels a little raw, but there is no doubt the 7900 XTX is enough faster than the 6800 XT that I have my watercooled 6800 XT sitting in the closet while my loop runs through itself. I have a block on the way, but the performance is that good. 200+ FPS in Division 2 with everything turned on at 1440p is nice, but as I said before, seeing my 5800X3D being pushed harder is crazy to think about. Even the picture from the card somehow looks a little crisper on the desktop (I can see more details in the Webb Carina Nebula background). Maybe I will try the Uranus one next. 

If the narrative and noise have you on the fence, don't stay on it; jump in and enjoy TPU when AMD does things that annoy everyone and make more than a few of us quite happy to have purchased this product. If you can, get the 3x8-pin card, but only if you have a powerful PSU; I would not start at less than 1000W with 3 PCIe connectors on the PSU itself. All in all, I can see distributors pricing this card below the 4090 but above some 4080s, though not 3090s or Tis. As the ray tracing performance becomes more appreciated (haven't tested it myself yet, CP2077 is tomorrow; I've been watching videos), I can definitely say that the 7900 XTX is a great card......Just don't get a reference, though historically, how long have reference cards been part of the ecosystem? I am not saying that all reference cards are compromised, but you won't get a reference card with three 8-pins.


----------



## tabascosauz (Jan 6, 2023)

AusWolf said:


> Thanks for that, but I was actually looking for a stress test result with a faulty card, something like this:
> 
> This is my 6750 XT at full stock settings, 85 °C max GPU temp and 100-105 °C hotspot. Since the videos talk about some kind of throttling (which is a nonexistent term on a modern AMD GPU where clock speeds always fluctuate, even in normal circumstances), I think the only way to prove or disprove a decrease in performance in relation to heat is by a stability test, which is surprisingly missing from every report.



Wonder how many people who've spent the past week deriding der8auer as a FUD spreader are going to eat their words.   Okay, in all seriousness, it was as hypothesized: a lack of fluid in certain batches of the cooler.

It is a minor performance penalty when throttling at 110C, but it's not placebo. Yes, he should have published clocks, but even in der8auer's own temp graphs it's obvious: a visibly rising temp that abruptly flatlines at 110C is not normal unthrottled behaviour for edge temp or hotspot temp on any GPU, period. Also, 100% fan speed earrape.

I'm just glad that [according to all sources so far] I won't have to worry about it on the XT MBA. My ETA got moved up a week, a good thing since I sold my TUF today and am now back on my old GPU.

Hopefully this drama is over and they can now focus on driver performance and addressing the multi-monitor idle.

@ 3:10


----------



## QuietBob (Jan 6, 2023)

kapone32 said:


> Well after about 3 hours of messing Around I am ready to give my Opinion on the 7900XTX.


Thanks a lot for sharing your observations! I'm really interested in the power consumption figures. Your data shows a minimum TBP of 25w, while in TPU's testing idle power was 12-16w. Was something running in the background in your case, an open web browser maybe?

Also, could you check with YouTube at 720p to 4K? AMD claims to have improved video playback consumption in their recent drivers.


----------



## Zyll Goliat (Jan 6, 2023)

tabascosauz said:


> Wonder how many people who've spent the past week deriding derbauer as a FUD spreader are going to eat their words   okay in all seriousness, it was as hypothesized, the lack of fluid in certain batches of cooler.
> 
> It is a minor performance penalty when throttling @ 110C, but it's not a placebo - yes he should have published clocks, but even in derbauer's own temp graphs it's obvious. Visibly rising temp then abruptly flatlining at 110C is not normal unthrottled behaviour for edge temp or hotspot temp on any GPU, period. Also 100% fan speed earrape
> 
> ...


So it seems everything is fine, folks, AMD has finally found the solution for their 7900 XTX vapor chamber issue......




Ehh....the vapor chamber missing just a tiny bit of water.....nothing to worry about, that's an easy fix


----------



## kapone32 (Jan 6, 2023)

QuietBob said:


> Thanks a lot for sharing your observations! I'm really interested in the power consumption figures. Your data shows a minimum TBP of 25w, while in TPU's testing idle power was 12-16w. Was something running in the background in your case, an open web browser maybe?
> 
> Also, could you check with YouTube at 720p to 4K? AMD claims to have improved video playback consumption in their recent drivers.


I always have Steam, GOG and Epic running. Of course there are the Windows background programs, like the weather and other news media pop-ups. I also have Skype running as well. 

Once I adjusted the power limits I saw that HWiNFO reports a max board draw of 530W. When gaming I only see a max of 364W, but when running benchmarks spikes to 410W can be seen. One issue is that the AMD stress test is broken: it will push the card for 10 seconds (of a 60-second test) and report idle for the next 50. I do have the overlay applied from AMD software and see a board power of 33 watts as we speak. 

I did try YouTube, watching the WEC 8 Hours of Bahrain at 1080p and a 4K Hardware Canucks CES MSI tour (no AMD laptops). The power draw was around 65 watts. I do believe the card has great potential for OC with a sweet undervolt, but you need to tame it. Maybe that is why the new blocks look like square reservoirs instead of channelled acrylic, and the heatsinks are so huge on AIB cards. The fact that there is one driver is good in my opinion. It means all the work that Sony, Microsoft and AMD are doing on hardware and software enhancements will be realized for this card, and it stands to reason that the next console will be AM5 with RDNA-whatever.


----------



## kurosagi01 (Saturday at 11:21 AM)

Thought I'd share my HWiNFO stats after about an hour of COD MW2 with my undervolt settings.
The max recorded temps seem perfectly within spec? I can't find any information on what the spec is for the temperature range.


----------



## phanbuey (Saturday at 11:52 AM)

kurosagi01 said:


> Thought i'll share my hwinfo stats after just about an hour of COD MW2 with my undervolt settings.
> The max recorded temps seems perfectly up to spec? I can't find any information what the spec is for the temperature range.


that's downright cold.  The spec is up to 110C before throttle.


----------



## kurosagi01 (Saturday at 12:31 PM)

Testing out a target boost clock of 2400 MHz at the moment, using NFS Unbound since that game seems to like crashing the PC with any of my undervolt settings.
It crashed after 2 races at 1050 mV (seems like that is my cap), so I increased the voltage to 1055 mV, which seemed OK after 2 races, but I wasn't too happy about the frame rate as I was getting 80-90 fps while racing.
I increased it to 1060 mV and was getting 100+ fps on average while racing.
I'm using the same fan speed and power limit at 15% as with my 1075 mV config at 2760 MHz.


----------



## AusWolf (Saturday at 12:34 PM)

tabascosauz said:


> Wonder how many people who've spent the past week deriding derbauer as a FUD spreader are going to eat their words   okay in all seriousness, it was as hypothesized, the lack of fluid in certain batches of cooler.
> 
> It is a minor performance penalty when throttling @ 110C, but it's not a placebo - yes he should have published clocks, but even in derbauer's own temp graphs it's obvious. Visibly rising temp then abruptly flatlining at 110C is not normal unthrottled behaviour for edge temp or hotspot temp on any GPU, period. Also 100% fan speed earrape
> 
> ...


Like I said, I don't want to fight any battle on any side... I just want to know how much performance is lost due to these thermal issues.

I know the internet is full of trolls, but is it so hard to believe that some of us just genuinely want to learn? 

Der8auer's video didn't give me the answers I was looking for, that's why I asked here.


----------



## HTC (Saturday at 12:56 PM)

Stupid question: have any of the proud new owners of XTX cards tried changing the cable to see if there are any differences in temps, as described in the video posted on the first page? I'm referring to ANY such card, even those that DON'T seem to have this issue.

Who knows: maybe swapping the cable MIGHT result in lower hotspot temps AND a small boost in performance, as opposed to a HUGE DROP in those hotspot temps and zero throttling.

Just thought of another stupid question: does swapping the cable have ANY EFFECT AT ALL on idle power consumption? Just wondering.


----------



## jesdals (Saturday at 1:00 PM)

Any lucky 7900 XTX owners using Eyefinity setups? I would like to hear about your experience with 3x DisplayPort on the reference card plus a dongle/USB-C combo. And how about gaming at my favourite resolution, 7680×1440: is it great, or doesn't the 24 GB of memory scale well with multi-monitor performance?


----------



## kapone32 (Saturday at 3:26 PM)

HTC said:


> Stupid question: have any of the proud new owners of XTX cards tried changing the cable to see if there are any differences in your temps, as described in the video posted in the 1st page? I'm referring to ANY such card: even those that DON'T seem to have this issue.
> 
> Who knows: maybe swapping the cable MIGHT result in lower hot spot temps AND a small boost in performance, as opposed to a HUGE DROP in those hot spot temps and zero throttling.
> 
> Just thought of another stupid question: does swapping the cable have ANY EFFECT @ ALL in idle power consumption? Just wondering.


I tried a DP 1.2 cable, and while I did not notice an increase in GPU temps, that cable felt really warm after 3 runs of 3DMark Time Spy. When I changed to a DP 1.4 cable, the issue went away. I only used it for 15 minutes though, so I can't say definitively that the cable is the cause of the hotspot temps. As I have said before, a quirk I have found is that my memory temperature is higher than my hotspot.



kurosagi01 said:


> Testing out a 2400 MHz target boost clock at the moment, using NFS Unbound since that game seems to like crashing the PC with any of my undervolt settings.
> It crashed after 2 races at 1050 mV (seems like this is my floor), so I increased the voltage to 1055 mV, which seemed OK after 2 races, but I wasn't too happy with the frame rate as I was only getting 80-90 FPS while racing.
> I've increased it to 1060 mV and I was getting 100+ FPS on average while racing.
> I'm using the same fan speed and 15% power limit as with my 1075 mV config at 2760 MHz.
> View attachment 278004


How is the TDC limit 448 A when my 7900 XTX is at 290 A? Can we get some more readings to compare?


----------



## kurosagi01 (Sunday at 12:56 PM)

kapone32 said:


> I tried a DP 1.2 cable, and while I did not notice an increase in GPU temps, that cable felt really warm after 3 runs of 3DMark Time Spy. When I changed to a DP 1.4 cable, the issue went away. I only used it for 15 minutes though, so I can't say definitively that the cable is the cause of the hotspot temps. As I have said before, a quirk I have found is that my memory temperature is higher than my hotspot.
> 
> 
> How is the TDC limit 448 A when my 7900 XTX is at 290 A? Can we get some more readings to compare?


I'm not sure why the TDC limit is 448 A; even on the default config it's reporting 448 A in HWiNFO.


----------



## kapone32 (Sunday at 1:08 PM)

kurosagi01 said:


> I'm not sure why the TDC limit is 448 A; even on the default config it's reporting 448 A in HWiNFO.


That seems very excessive, but your card seems to be working great. How are you enjoying it otherwise?


----------



## kurosagi01 (Sunday at 1:14 PM)

kapone32 said:


> That seems very excessive, but your card seems to be working great. How are you enjoying it otherwise?


Getting a lot higher frames than I was with my 3080, and that's without DLSS, which is great. The fan noise of the reference cooler isn't bad either with my current undervolt; I can barely hear it over the other fans installed.
Some games still need optimising though, since it's a new card with new drivers. NFS Unbound is still a bit choppy every now and then, which I think Criterion needs to patch as well.
Can't really complain when I have 2 high-end rigs in the house now, which my partner will be able to enjoy at her desk setup, but for now it's the 4K gaming machine in the living room.


----------



## kapone32 (Sunday at 1:25 PM)

kurosagi01 said:


> Getting a lot higher frames than I was with my 3080, and that's without DLSS, which is great. The fan noise of the reference cooler isn't bad either with my current undervolt; I can barely hear it over the other fans installed.
> Some games still need optimising though, since it's a new card with new drivers. NFS Unbound is still a bit choppy every now and then, which I think Criterion needs to patch as well.
> Can't really complain when I have 2 high-end rigs in the house now, which my partner will be able to enjoy at her desk setup, but for now it's the 4K gaming machine in the living room.


F me, 4K. This card laughs at 4K. I have been running some benchmarks, and Strange Brigade, for example, was like 267 FPS. The Division 2 will sit at 170 FPS all day, and TWWH3 is so butter-smooth in battles it is addicting. Redout does not give me much over the 6800 XT, and Just Cause 4 is higher but only by about 10-15 FPS, but I agree wholeheartedly with your statement: let the drivers come.


----------



## kurosagi01 (Sunday at 1:44 PM)

kapone32 said:


> F me 4K. This card laughs at 4K. I have been running some benchmarks and Strange Brigade as an example was like 267 FPS. The Division 2 will sit at 170 FPS all day and TWWH3 is so butter smooth in battles it is addicting.  Redout does not give me much over the 6800XT and Just Cause 4 is higher but only by about 10 to 15 FPS but I agree wholeheartedly with your statement, let Drivers come.


I would like to try the 7900 XT at 4K, but I don't think I'll be upgrading to a 4K monitor anytime soon.
Neither do I fancy moving my PC downstairs while hers is being used as the 4K machine, haha.
I've just swapped my Iiyama for an AOC monitor as the Iiyama is being sent in for warranty work. Feels good to be back on a 1000R curved VA panel at the same resolution, but with a 165 Hz range.


----------



## MrPerforations (Monday at 8:35 PM)

Hello Club,
I got me the Sapphire™ RX 7900 XTX Nitro+ card to replace my AMD™ RX Vega 56 CrossFire system.
Not been too impressed with the benchmarks I have run, but I found I didn't have any of the 4K benchmarks saved from the RX Vega 56 CrossFire setup.
I just did the Superposition 1080p Extreme bench for the thread.
I got 15000 for the first run; then I noticed Resizable BAR was disabled, so I enabled that as well as turning on PBO in the BIOS, and gained a 17000 score.
It benched on par with an Nvidia™ 4080, which is about right.
I have a water block coming tomorrow, the Alphacool™ one.
I will let you know about the temps, and then I can play with some OC.


----------



## AusWolf (Monday at 9:00 PM)

MrPerforations said:


> Hello Club,
> I got me the Sapphire™ RX 7900 XTX Nitro+ card to replace my AMD™ RX Vega 56 CrossFire system.
> Not been too impressed with the benchmarks I have run, but I found I didn't have any of the 4K benchmarks saved from the RX Vega 56 CrossFire setup.
> I just did the Superposition 1080p Extreme bench for the thread.
> ...


Congrats! 

But why did you buy a Nitro+ if you were planning on watercooling it? You could have saved a few bucks with an MBA card.


----------



## MrPerforations (Monday at 9:17 PM)

Thanks.
The 3x 8-pin board; and the last time I got the best one of all the cards of that type, it was a good idea, as it worked out to be the best performance and cheap. At the time it was better than a 2080 at half the price.
Also, I'm not sure if I want to keep this one ATM.
My regret is not waiting for the ASRock™ cards to come into stock, or the MSI™ ones to be released; I bet they beat this one up.

I've been running the AMD™ Adrenalin overlay to monitor the FPS etc., and I noticed that my CPU is only at about 5% utilisation at 4K while the GPU is at 100% - is that right, please?
Gotta be able to save money on the CPU if that's right, for sure.


----------



## tabascosauz (Monday at 9:30 PM)

MrPerforations said:


> my regret i have is not waiting for the ASRock™ cards to come in stock or the MSI™ ones to be released, i bet they beat this one up.



What makes you say that though? So far AIB design Sapphire, XFX and ASRock (Taichi and Aqua) all look like they use almost-reference PCBs that are only slightly extended to accommodate 3 x 8-pin. The variance in OC headroom looks like it all comes from the GPU itself at this time.


----------



## kapone32 (Monday at 9:31 PM)

MrPerforations said:


> thanks,
> the 3x8 pins board, and last time i got the best one of all the card of that type, it was a good idea as it worked out to be the best performance and cheap. at the time it was better than a 2080 and half the price.
> also, im not sure if i want to keep this atm.
> my regret i have is not waiting for the ASRock™ cards to come in stock or the MSI™ ones to be released, i bet they beat this one up.


The Pulse is actually more expensive than the Nitro right now, but you got basically the best AMD GPU vendor. Sapphire is older than EVGA, and they only make GPUs. What you get is usually one of the best-binned GPUs and super nice build quality. You will see, when you take your card apart, how much engineering has gone into it vs the competition.


----------



## MrPerforations (Monday at 9:50 PM)

Been waiting for Buildzoid's™ review of the boards.
I don't know electronics, but I think the Sapphire™ has ceramic caps on it, while the ASRock™ has 22 phases vs 20, but like I said, I don't know.


----------



## kapone32 (Monday at 9:51 PM)

tabascosauz said:


> What makes you say that though? So far AIB design Sapphire, XFX and ASRock (Taichi and Aqua) all look like they use almost-reference PCBs that are only slightly extended to accommodate 3 x 8-pin. The variance in OC headroom looks like it all comes from the GPU itself at this time.


The 3x8 pin is the only reason I got an AIB card. I wanted to get the Aqua but my friend at the PC store told me that there were actually only 5 Aqua units allocated for Canada (his store).


----------



## MrPerforations (Monday at 9:56 PM)

Yeah, I planned on the ASRock™ RX 7900 XTX Taichi plus a water block, to make the same thing for less.
Thus, I might return the Sapphire™ yet.


----------



## kapone32 (Monday at 10:04 PM)

MrPerforations said:


> Yeah, I planned on the ASRock™ RX 7900 XTX Taichi plus a water block, to make the same thing for less.
> Thus, I might return the Sapphire™ yet.


Don't. ASRock is the newest GPU vendor in the space, and they make nice cards, but the only real advantage they have is pricing, where they only really compete with Gigabyte. The only cards arguably better than your Nitro would be the Toxic (not released), Strix, Trio OC and Aorus, but if you are putting a block on it, that becomes moot. Here is something you have on the Nitro that other cards don't: an ARGB header and a 4-pin PWM header on the board. The ARGB is not what I mean; you will be able to connect your rad fans to your GPU and still use AMD software to adjust the fan curve. That is something to have vs 2 more VRM phases. The quality of Sapphire's components should also not be ignored.


----------



## tabascosauz (Monday at 10:53 PM)

kapone32 said:


> The quality of Sapphire's components should also not be ignored.



That is something I have always assumed too, but the only meaningfully unique PCB I've seen so far is the TUF. Everything else largely builds on the same reference design; even the TUF uses the exact same MPS parts in the VRM and same Vcore phase count. Sapphire in particular seems to be using the same cap supplier as MBA, but I guess that's not surprising since there were so many rumors that Sapphire manufactures AMD's reference boards. The only ASRock pictures I see for Taichi and Aqua are silhouettes, but the size of the PCB and layout of major components looks identical. 

Of course, maybe it's a vote of confidence in AMD's reference design. If all the AIBs feel confident enough in it even with 3 x 8-pin, that's a good sign considering the 450W the Nitro was pulling on what is essentially a reference PCB. 

Anyway, I agree, it's a bit of a stretch to expect ASRock or MSI to come up with anything radically better than the rest. Instead of worrying about PCB components it's a bit more concerning to me the variance in GPU silicon quality and voltage requirements (XTX and XT undervolting results on reddit seem to be wildly all over the place).

Nitro in the middle.


kurosagi01 said:


> I'm not sure whether the fan stop actually kicks in or not for me, since I don't really look and all the fans in general kind of overpower the GPU noise for me.
> As for attempting to see how low I can undervolt the vcore with stock target boost clock, i've not really tried.
> The way I've set the target core clock to 2760mhz,is based on the auto value entered when i've adjusted the power limit to 15%(the core clock value changes automatically changed to 2800mhz when I increased the power limit).
> So I did 2400(default boost clock)*1.15(15% power limit increase), which gives 2760mhz, so i've set the target as that instead.
> ...
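For reference, the target clock in the quoted post is just the stock boost clock scaled by the power-limit percentage; a quick sanity check of the poster's own numbers (not an AMD spec) looks like this:

```python
# Reproduce the quoted target-clock arithmetic: stock boost clock
# scaled by the +15% power limit. Numbers come from the post above.
default_boost_mhz = 2400   # default target boost clock
power_limit = 0.15         # +15% power limit slider

target_mhz = default_boost_mhz * (1 + power_limit)
print(round(target_mhz))  # 2760
```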



Ok back to the fan stop issue as I can comment on it now.......it's frustrating as it's unnecessarily conservative. I still haven't managed to figure out just what the fan stop criteria is. It's clearly not GPU edge temp alone. 

Because of how boosty RDNA is and the video playback power consumption it's exceedingly difficult to get the fans to actually stop. The video power "bug" has been addressed as the review samples were pulling like 80W on Youtube, but 40-50W is not what I'd call stellar compared to ~20W on Ampere since it affects fan speed. It doesn't even heat up the GPU at all, but it keeps it barely above the fan stop threshold, just enough to spin the fans for no good reason and piss me off. 

I tried making a custom fan profile in Radeon Software but that tuning section is buggy as hell and seems to set whatever it wants after a reboot. 

I can have a browser open and still be at 20-25W and have fan stop.............as long as none of those tabs are Youtube. Even as a background tab, as long as the browser window itself is in focus, power consumption shoots up to 50W.

Having been away from Radeon for so long, I'm not sure what to expect in terms of AMD fixing this behaviour. From what I've read, it is probably a bad idea to have any expectations at all.


----------



## AusWolf (Tuesday at 6:38 AM)

tabascosauz said:


> Ok back to the fan stop issue as I can comment on it now.......it's frustrating as it's unnecessarily conservative. I still haven't managed to figure out just what the fan stop criteria is. It's clearly not GPU edge temp alone.
> 
> Because of how boosty RDNA is and the video playback power consumption it's exceedingly difficult to get the fans to actually stop. The video power "bug" has been addressed as the review samples were pulling like 80W on Youtube, but 40-50W is not what I'd call stellar compared to ~20W on Ampere since it affects fan speed. It doesn't even heat up the GPU at all, but it keeps it barely above the fan stop threshold, just enough to spin the fans for no good reason and piss me off.
> 
> ...


If your card is anything like the RDNA 1/2 cards that I've had, then the fan stop works based on GPU temperature. The cutoff point is usually around 50 °C, which I suppose can easily be reached with a multi-monitor setup watching YouTube, given the power issues RDNA 3 has with current drivers.


----------



## tabascosauz (Tuesday at 7:10 AM)

AusWolf said:


> If your card is anything like the RDNA 1/2 cards that I've had, then the fan stop works based on GPU temperature. The cutoff point is usually around 50 °C, which I suppose can easily be reached with a multi-monitor setup watching YouTube, given the power issues RDNA 3 has with current drivers.



It's certainly not 50 °C. Looks like 40 °C; it might be after 60 seconds spent below 40 °C or something like that.

Both CODs were okay, but Genshin and Sniper Elite 5 are microstuttering like mad. Too short to be picked up by LatencyMon, but obvious to anyone with a pair of functioning eyes. Been hitting buttons in Radeon Software to see if they do anything; a reinstall didn't do anything either. Fully DDU'd before I started, so not sure what more I can do aside from a clean install and literally reinstalling all my games.

We needed the new drivers yesterday, AMD


----------



## Vya Domus (Tuesday at 7:46 AM)

tabascosauz said:


> Ok back to the fan stop issue as I can comment on it now.......it's frustrating as it's unnecessarily conservative. I still haven't managed to figure out just what the fan stop criteria is. It's clearly not GPU edge temp alone.



Reference models don't have fan stop as far as I know. At least in my case they are completely inaudible.


----------



## tabascosauz (Tuesday at 7:55 AM)

Vya Domus said:


> Reference models don't have fan stop as far as I know. At least in my case they are completely inaudible.



It does have the feature. The way it works is highly questionable though


----------



## Vya Domus (Tuesday at 8:01 AM)

tabascosauz said:


> It does have the feature. The way it works is highly questionable though



It could be; it's just that no review mentioned it, so I am assuming it doesn't have it. As a matter of fact, I can't find any info anywhere that it does have fan stop.


----------



## tabascosauz (Tuesday at 8:04 AM)

Vya Domus said:


> It could be; it's just that no review mentioned it, so I am assuming it doesn't have it. As a matter of fact, I can't find any info anywhere that it does have fan stop.



No, like, I literally have the MBA and the fans stop if you give it enough time.........key word is time. Fans start spinning at roughly the same point as AIB zero-RPM implementations (i.e. 50 °C), but it takes forever for them to stop again at sub-40 °C.

Maybe the fans turn off when the hotspot reaches 40 °C, then don't come back on until 50 °C?

I guess w1zz (who mentioned fan stop in his review) waited long enough, and Linus (who said there is no fan stop) didn't.


----------



## Vya Domus (Tuesday at 8:19 AM)

tabascosauz said:


> No, like, I literally have MBA and the fans stop if you give it enough time......
> 
> ...key word is time, however. Fans start spinning roughly the same time as AIB zero rpm implementations (ie. 50C), but it takes forever to stop again at sub-40C.
> 
> I guess w1zz waited long enough (who mentioned fan stop in review) and linus didn't (who said there is no fan stop).



My guess is that the threshold is based on power and not on temperature. It could also be some other software that keeps causing spikes in usage and frequency. When I had the 1080, it would often remain stuck at 1000 MHz or something like that at idle; I could never figure out why.


----------



## AusWolf (Tuesday at 8:24 AM)

tabascosauz said:


> No, like, I literally have MBA and the fans stop if you give it enough time.........key word is time. Fans start spinning roughly the same time as AIB zero rpm implementations (ie. 50C), but it takes forever to stop again at sub-40C.
> 
> Maybe fans turn off when hotspot reaches 40C? Then don't come back on until 50C?
> 
> ...


That's very strange. My reference 6750 XT turns its fans on and then off at 50 °C. Also, when I quit a game, GPU and hotspot temps equalize immediately.


----------



## tabascosauz (Tuesday at 8:45 AM)

AusWolf said:


> That's very strange. My reference 6750 XT turns its fans on and then off at 50 °C. Also, when I quit a game, GPU and hotspot temps equalize immediately.



The MBA has no heatpipes, only a large vapor chamber with fins soldered to it. Maybe that's why it takes a while for temps to come back down. It might be relatively compact, but even my 2060S FE was faster at cooling itself down than this card.

Also, the fans aren't helping. The second the load disappears they drop back to 500 RPM, but the GPU is still at 50 °C and the heatsink fins are still hot. I might need to reinstall 22.12.2 again to get the Full Install back so I can change that fan curve; this is fucking ridiculous.


----------



## kurosagi01 (Tuesday at 8:54 AM)

I've just had a quick look to check whether fan stop is "working" on my MBA 7900 XT, and the fans are indeed not moving for me.
My office temperature is currently 18.2 °C and the PC has only been on for the past 20 minutes; the only things I've done in that time are watch YouTube and browse the web.

To be fair, if I had waited a bit longer, the PowerColor Hellhound 7900 XT for £9.99 more would have been a slightly better deal than the MBA.


----------



## tabascosauz (Tuesday at 9:14 AM)

kurosagi01 said:


> I've just had a quick look to check if the fan stop is "working" on my MBA 7900XT and the fans are not actually moving for me.
> My office temperature is currently 18.2c and the PC has only been on for the past 20minutes, the only things i've done within the 20 minutes is watch youtube and browse the web.
> View attachment 278399
> To be fair,if I waited a bit longer, the Powercolor Hellhound 7900XT for £9.99 more would have been a slightly better deal than the MBA.



Sounds about right; at idle I eventually settle at about 36-38 °C with an ambient of 23.5 °C.

An AIB card was not in the cards for me; those cards are too honking huge. I could have had an XFX Speedster for the exact same price, but it wouldn't fit.

Looks like windowed VRR in 22H2 needs to be disabled for 60 Hz-limited FreeSync to work properly and not be a stuttery shit show. Seems to solve most of the Genshin performance issues. Funnily enough, G-Sync Compatible is the opposite, and won't even work without windowed VRR enabled.

Sniper Elite 5 just looks broken though. I've been playing it every day on 3 different GPUs now, and V-Sync straight up does not work on this card.


----------



## kurosagi01 (Tuesday at 9:19 AM)

tabascosauz said:


> Sounds about right, at idle I settle eventually at about 36-38 with an ambient of 23.5C.
> 
> AIB was not in the cards for me, those cards are too honking huge. I could have had a XFX Speedster for the exact same price, but I couldn't.
> 
> ...


The size wasn't that big of an issue for me, since I'm still using a Define R4 case, which can fit up to 430 mm of GPU length. I'm more concerned about the weight of the card than the size, and they are getting heavier as each generation progresses.
The Radeon drivers definitely still need refinement, so hopefully things will improve this year.


----------



## AusWolf (Tuesday at 9:25 AM)

tabascosauz said:


> MBA has no heatpipes, only a large vapor chamber with fins soldered to it. Maybe that's why it takes a while for temps to come back down. It might be relatively compact but even my 2060S FE was faster at cooling itself down than this card.
> 
> Also, the fans aren't helping. The second the load disappears they come back down to 500rpm, but the GPU is still at 50C and the heatsink fins are still hot. I might need to reinstall 22.12.2 again to get the Full Install back so I can change that fan curve, this is fucking ridiculous.


Well, the 6700 XT has heatpipes, but the 6800 XT seems to have the same vapor chamber design as the 7900 MBA cards. I'm not sure about the 6750 XT that I have, though. It has the same three-fan appearance as the 6800 XT, but I can't see under the shroud and there's no disassembly video on it as far as I've checked (and I definitely won't vandalize this beauty). 

It has the same slow cooldown as your 7900 XT. I'm watching some Youtube videos right now and the card is sitting nicely at 50-52 °C with the fans off.

Interestingly, it seems to cool down slower after a low load (due to Vsync or an FPS limit). If it goes full throttle, 200 W GPU chip power, 85 °C GPU and 105 °C hotspot, the fans spin fast enough so that they dispel the heat quite quickly after I quit the game. If I have an FPS limit, though, and the card sits at 70-ish °C with the hotspot not exceeding 90, the fans spin so slowly that the card takes a good 4-5 minutes to cool down. But even then, the fans turn off at 50 °C, and the card keeps cooling towards 40-45 °C extremely slowly with the case airflow.


----------



## tabascosauz (Tuesday at 9:36 AM)

AusWolf said:


> Well, the 6700 XT has heatpipes, but the 6800 XT seems to have the same vapor chamber design as 7900 MBA cards. I'm not sure about the 6750 XT that I have, though. It has the same three fan appearance as the 6800 XT, but I can't see under the shroud and there's no disassembly video on it as far as I've checked (and I definitely won't vandalize this beauty).
> 
> It has the same slow cooldown as your 7900 XT. I'm watching some Youtube videos right now and the card is sitting nicely at 50-52 °C with the fans off.
> 
> Interestingly, it seems to cool down slower after a low load (due to Vsync or an FPS limit). If it goes full throttle, 200 W GPU chip power, 85 °C GPU and 105 °C hotspot, the fans spin fast enough so that they dispel the heat quite quickly after I quit the game. If I have an FPS limit, though, and the card sits at 70-ish °C with the hotspot not exceeding 90, the fans spin so slowly that the card takes a good 4-5 minutes to cool down. But even then, the fans turn off at 50 °C, and the card keeps cooling towards 40-45 °C extremely slowly with the case airflow.



As expected, Radeon Software is completely uninterested in saving my custom fan curve after a reboot. I even DDU'd again just to get it back

Never change, AMD, never change

Also, looks like w1zz was wrong about the USB-C: you can use it just like on the 20-series FE and RX 6000 cards, no problems hooking it up to 10 Gbps devices. Not quite compensation for the headache today, but I'll take it


----------



## jesdals (Tuesday at 9:57 AM)

tabascosauz said:


> As expected, Radeon Software is completely uninterested in saving my custom fan curve after a reboot. I even DDU'd again just to get it back


I had similar problems in Windows 10; after a fresh Windows 11 install the problem was solved. But if your system crashes, it will reset the profile.


----------



## tabascosauz (Tuesday at 10:20 AM)

jesdals said:


> I had similar problems in Windows 10; after a fresh Windows 11 install the problem was solved. But if your system crashes, it will reset the profile.



No sir, no crashing at all. Things are pretty stable so far, just not performing up to par in terms of user experience.

I might do a clean install this week if nothing changes, as I have time, but I really don't want to do it.

If I find that nothing improves or is fixed after a clean install, I may just deflate like a balloon for a while


----------



## Vya Domus (Tuesday at 11:17 AM)

kurosagi01 said:


> I've just had a quick look to check if the fan stop is "working" on my MBA 7900XT and the fans are not actually moving for me.
> My office temperature is currently 18.2c and the PC has only been on for the past 20minutes, the only things i've done within the 20 minutes is watch youtube and browse the web.



I think this might be vBIOS-related; my card is the Sapphire, and I've never seen the fans turn off. I also have Linux installed, so I'll check if the fans ever turn off there; if not, then it definitely has nothing to do with drivers.


----------



## GamerGuy (Tuesday at 12:31 PM)

My Nitro+ RX 6900 XT has this default 'Zero Fan' setting that always resets whenever I restart my PC. So annoying, but I can live with it as long as I disable 'Zero Fan' and set the fans to my custom setting (usually I'd save it as 'Fanprofile' in the Adrenalin control panel). Basically, the fans won't spin till a certain temperature threshold has been met.


----------



## tabascosauz (Tuesday at 7:50 PM)

Vya Domus said:


> I think this might be vBIOS-related; my card is the Sapphire, and I've never seen the fans turn off. I also have Linux installed, so I'll check if the fans ever turn off there; if not, then it definitely has nothing to do with drivers.



What's your room ambient? If edge temp is below 40 °C but hotspot temp does not at least touch 40 °C, then I don't think it will be enough to activate. And even right after booting, it still takes a few minutes for me.

It's so dumb; if they just made it based on edge temp <45 °C or <50 °C, with an additional check for board power >50 W, it would behave the same as any other sane card.

I think all the MBA cards are made by the same company, just like reference RX6000. Shouldn't be a difference.

I'm waiting for MPT to be updated for RDNA 3; there is a fan curve tab in there that should be more permanent than the POS Radeon Software
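The zero-RPM scheme being asked for above (spin up above ~50 °C or under a real load, spin down only once the card is both cool and idle) is just a hysteresis check. A minimal sketch, where the 50 °C on / 45 °C off / 50 W thresholds are assumptions from this thread and not AMD's actual firmware values:

```python
# Hypothetical zero-RPM (fan stop) hysteresis, sketched from the
# behaviour discussed in this thread. The 50 C on / 45 C off / 50 W
# thresholds are assumptions, not AMD firmware values.

def fans_should_spin(edge_temp_c: float, board_power_w: float,
                     currently_spinning: bool) -> bool:
    """Return True if the fans should be spinning this tick."""
    if currently_spinning:
        # Hysteresis: keep spinning until the card is cool AND idle,
        # so the fans don't oscillate around a single threshold.
        return edge_temp_c >= 45.0 or board_power_w > 50.0
    # Fans are off: only spin up on real heat or a real load.
    return edge_temp_c >= 50.0 or board_power_w > 50.0
```

With a power check like this, 40-50 W video playback that barely warms the die would still count as a load and spin the fans, while a genuinely idle card below the off-threshold would stay silent; the two thresholds are separated so the fans don't flutter around a single cutoff.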


----------



## Vya Domus (Tuesday at 7:53 PM)

tabascosauz said:


> What's your room ambient? If edge temp is below 40 but hotspot temp does not at least touch 40, then I don't think it will be enough to activate. And even right after booting, it still takes a few minutes for me.
> 
> I think all the MBA cards are made by the same company, just like reference RX6000. Shouldn't be a difference.
> 
> I'm waiting for MPT to be updated for RDNA3, there is a fan curve tab in there that should be more permanent than POS Radeon Software



Mine's pretty much always above 40 °C; I have like 26-27 °C room temp. But I doubt 40 °C is the threshold; it seems way too low.


----------



## tabascosauz (Tuesday at 7:56 PM)

Vya Domus said:


> Mine's pretty much always above 40C, I have like 26-27C room temp but I doubt 40C is the threshold, it seems way too low.



If my theory is correct, 27 °C ambient is too high to test it.

That's exactly what I thought too, because the fan-on threshold is definitely higher.

But every single time the fans turn OFF, it has been at <40 °C edge temp and exactly 40 °C hotspot.

That's why it feels like a bug


----------



## Vya Domus (Tuesday at 8:02 PM)

tabascosauz said:


> That's why it feels like a bug



Maybe, but I don't really care; my case fans are louder at idle than the card is, so it makes no difference to me.


----------



## tabascosauz (Wednesday at 8:30 AM)

jesdals said:


> I had similar problems in Windows 10; after a fresh Windows 11 install the problem was solved. But if your system crashes, it will reset the profile.



There's the eureka! moment:

The second most powerful GPU in AMD's lineup is *LITERALLY INCAPABLE* of handling multi-monitor, at least on current drivers (22.12.2).

Turned off my side monitor (S2721DGF in vertical orientation), and 100% of the insufferable stuttering in Genshin goes away. Turn it on, instantly comes back. Turn it on, turn it off, I haven't seen a magic show like this since I was in grade school.

Before you ask, this was with the S2721DGF at 60Hz. Performs even worse than with the side monitor at 165Hz. 

I have:

- DDU'd once (when I swapped GPUs), and DDU'd twice (to get Radeon Software back)
- Clean installed Windows twice today
- Reinstalled drivers (to try Minimal Install), reinstalled again (to try Full Install), and reinstalled again (to finally try Driver Only)
- Tried playing with FRTC, disabled metrics/overlay, tried Radeon Chill on/off, set custom clocks/power under Tuning, tried Enhanced Sync on/off, fiddled with FreeSync

...............and at the end of the day the only thing that worked was to disconnect the second monitor LMAO

I have one last trick up my sleeve - S2721DGF is a DP 1.4 monitor but I've been using a 1.2 cable. About to go to the store to pick up a VESA certified 1.4 cable. Don't expect it to make a difference.

Also had my first crash ever in Sniper Elite 5, after *an hour of the 7900 XT being outperformed by a 2060 Super at 1440p*. C'mon AMD, this is just too good. The gift that keeps on giving entertainment


----------



## AusWolf (Wednesday at 8:43 AM)

tabascosauz said:


> There's the eureka! moment:
> 
> Second most powerful GPU in AMD's lineup is* LITERALLY INCAPABLE* of handling multi-monitor, at least on current drivers (22.12.2).
> 
> ...


It's a shame you don't have an iGPU in your CPU. I always connect my secondary display to the iGPU's HDMI port (although it's a 7" touchscreen, and even my main monitor is just 1080p). I've been using this setup with CPUs and GPUs from every brand without issues for 6 years now.


----------



## tabascosauz (Wednesday at 8:51 AM)

AusWolf said:


> It's a shame you don't have an iGPU in your CPU. I always connect my secondary display to the iGPU's HDMI port (although it's a 7" touchscreen, and even my main monitor is just 1080p). I've been using this setup with CPUs and GPUs from every brand without issues for 6 years now.



I used to do this all the time on my 4790K! In the blue camp the only time I couldn't do it was with the 1230v2 (Xeon E3 so duh). Always worked like a charm.

Not enough to make me buy into AM5 just for the iGPU, though. I can't swap in my 5700G because the Impact does not have iGPU output. 

If I'm forced to upgrade due to AMD never fixing this (unlikely but who knows), it won't be to another AMD platform. I think I've paid my dues with 3 years of AM4 hardware and this 7900XT.

Sniper Elite 5 is also better now and zero stuttering, but still being outperformed by the 3070 Ti. This is on DX12 though, maybe Vulkan will be better


----------



## AusWolf (Wednesday at 9:12 AM)

tabascosauz said:


> I used to do this all the time on my 4790K! In the blue camp the only time I couldn't do it was with the 1230v2 (Xeon E3 so duh). Always worked like a charm.
> 
> Not enough to make me buy into AM5 just for the iGPU, though. I can't swap in my 5700G because the Impact does not have iGPU output.
> 
> ...


Maybe AMD is playing a "fine wine" game once again with RDNA 3. I can't believe the issue you described never came up during testing, unless their tests weren't thorough enough.


----------



## tabascosauz (Wednesday at 9:27 AM)

AusWolf said:


> Maybe AMD is playing a "fine wine" game once again with RDNA 3. I can't believe the issue you described never came up during testing, unless their tests weren't thorough enough.



Oh I'd bet on finewine happening. I'm just not sure it's worth all this. 6700XT and 6750XT pretty much finewine'd their way past the 3070 and 3070 Ti in a bunch of games, but I never cared about that - I just wanted to get away from 8GB of VRAM.

On a single monitor, with a clean Driver Only install, the card is now clocking as it should in SE5 (270-310W at 2.5GHz+). Previously, with two monitors, it was stuck as a stuttering mess at 180W and 1900MHz. Performance probably still has room to improve; it's only outperforming the 3070 Ti native by a little bit.

FSR simply doesn't work, perf equal to or worse than native. Also Vsync does not work in Vulkan, whereas on 3070 Ti both work fine. I know it's a FSR 1.0 implementation but the image quality at Ultra FSR on the 3070 Ti was stunning - I couldn't believe it wasn't DLSS. By comparison Ultra FSR on the 7900XT is a blurry trash heap

Anecdotally, reddit has blamed everything under the sun on cheap uncertified DP cables. The vapor chamber problem, burning cables, stuttering, crashing, bad perf.........we shall see tomorrow.


----------



## AusWolf (Wednesday at 9:49 AM)

tabascosauz said:


> Oh I'd bet on finewine happening. I'm just not sure it's worth all this. 6700XT and 6750XT pretty much finewine'd their way past the 3070 and 3070 Ti in a bunch of games, but I never cared about that - I just wanted to get away from 8GB of VRAM.
> 
> On a single monitor, with a clean Driver Only install, card is now clocking as it should in SE5 (270-310W at 2.5GHz+). Previously with two monitors it was stuck a stuttering mess at 180W at 1900MHz. Performance probably still has room to improve, it's only outperforming 3070 Ti native by a little bit.
> 
> ...


It's weird that Nvidia has been selling revisions of the same architecture under different names for 3 generations now, while AMD has been doing the exact opposite - selling 3 different architectures under the same name. I just wish they got them right. After the buggy RDNA 1, we had the awesome RDNA 2, and now it's back to bugs again with RDNA 3, it seems. 

The weirdest thing is that we have people living on this planet who believe that progress is always linear, change is always positive and the ideal launch date of any new product is yesterday.


----------



## kapone32 (Wednesday at 2:12 PM)

AusWolf said:


> It's weird that Nvidia has been selling revisions of the same architecture under different names for 3 generations now, while AMD has been doing the exact opposite - selling 3 different architectures under the same name. I just wish they got them right. After the buggy RDNA 1, we had the awesome RDNA 2, and now it's back to bugs again with RDNA 3, it seems.
> 
> The weirdest thing is that we have people living on this planet who believe that progress is always linear, change is always positive and the ideal launch date of any new product is yesterday.


This is the first run of chiplet-based GPUs. It is very different from anything that has come before it. As they get their heads around optimizing it, we could see tremendous improvements in performance.


----------



## MrPerforations (Wednesday at 2:33 PM)

Alphacool™ water block is here, a day and a half late, but it's here.
It is very heavy, alarmingly so.


----------



## tabascosauz (Wednesday at 5:38 PM)

Anyone in Canada looking at the XT, here's the first new sub-$1200 deal I've seen pop up after last week's MBA and MERC at $1150:




Might only last about a day, knowing Amazon. Powercolor seems to be leading the charge on pricing


----------



## kapone32 (Wednesday at 6:06 PM)

MrPerforations said:


> Alphacool™ water block is here, a day and a half late, but it's here.
> It is very heavy, alarmingly so.


That means the block is quality. Tell me, does the water area look like a pool, or is it channelled?


----------



## MrPerforations (Wednesday at 8:35 PM)

Micro-fin center, but more like a pool than a channel. Didn't really look, and now it's in the machine.
Bad news is the memory junction temp and the hot spot temp didn't change much, saved about 5C in that area.
I assume it's just that hot.
The package temp is down to around 41C.


----------



## tabascosauz (Wednesday at 10:31 PM)

AMD Software: Adrenalin Edition 23.1.1 for AMD Radeon™ RX 7900 Series Graphics Release Notes | AMD

New 23.1.1 driver, anyone on it yet? From the release notes:



> Known issues
> 
> High idle power has situationally been observed when using select high resolution and high refresh rate displays.
> Video stuttering or performance drop may be observed during gameplay plus video playback with some extended display configurations.



So, in the past 3 weeks, AMD fixed absolutely NOTHING  

I just tried switching the second monitor over to HDMI 2.0 and disabling Freesync; the card is still completely incapable of handling a second monitor on 22.12.2. Gonna try 23.1.1 now

The funny thing is that while I was gathering Genshin metrics today with CapFrameX, multi-monitor and single-monitor runs repeatedly come out exactly the same on frametimes and FPS, almost identical data. Yet it's _blatantly_ obvious to anyone with a set of eyes that the multi-monitor setup is CONSTANTLY stuttering. Maybe it's a Freesync problem where shit just hits the fan when there's a second monitor connected? The S2721DGF, my vertical monitor, already has Freesync permanently disabled just to try to minimize any compatibility issues.
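
For what it's worth, averaged frametime and FPS numbers can mask exactly this: a few big spikes barely move the mean, but percentile lows expose them. A rough illustration with made-up numbers (this is not CapFrameX's exact method, just the general idea):

```python
# Sketch: why mean frametimes can look identical while one run stutters.
# Frametimes in milliseconds; the stuttery run has periodic 50ms spikes.
def summarize(frametimes):
    """Return mean frametime, 99th-percentile frametime, and 1% low FPS."""
    fts = sorted(frametimes)
    mean = sum(fts) / len(fts)
    p99 = fts[int(0.99 * (len(fts) - 1))]          # 99th-percentile frametime
    worst_1pct = fts[int(0.99 * len(fts)):] or [fts[-1]]
    low_fps = 1000 / (sum(worst_1pct) / len(worst_1pct))
    return mean, p99, low_fps

smooth = [16.7] * 100                  # steady ~60fps
stutter = [15.5] * 97 + [50.0] * 3    # similar total time, but spiky

for name, run in (("smooth", smooth), ("stutter", stutter)):
    mean, p99, low = summarize(run)
    print(f"{name}: mean={mean:.1f}ms  p99={p99:.1f}ms  1%low={low:.0f}fps")
```

The two runs come out within a fraction of a millisecond on the mean, while the 1% low collapses from ~60fps to ~20fps on the stuttery run, which matches the "identical data, obvious stutter" observation.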


----------



## Super Firm Tofu (Wednesday at 10:36 PM)

tabascosauz said:


> So, in the past 3 weeks, AMD fixed absolutely NOTHING



You must be new here.  Still waiting on Adrenalin drivers to support AMD CPU monitoring of AM5 (since September FFS).


----------



## tabascosauz (Wednesday at 10:39 PM)

Super Firm Tofu said:


> You must be new here.  Still waiting on Adrenalin drivers to support AMD CPU monitoring of AM5 (since September FFS).



You reckon I should just return it and get a reference 6950XT? $100 less, not a great deal, but at least there are no surprises.


----------



## Super Firm Tofu (Wednesday at 10:43 PM)

tabascosauz said:


> You reckon I should just return it and get a reference 6950XT? $100 less, not a great deal, but at least there are no surprises.



Well, I had a Red Devil 7900 XT ready to be ordered until I read your feedback so far on the 7000 series.  Holding off for now.

As far as the 6950XT, if you think your power supply is up to the transients of RDNA2, it's probably a less frustrating option.

*edit* - I forgot that the 6950XT resolved the huge 20ms spikes that the 6900XT has.


----------



## AusWolf (Wednesday at 10:53 PM)

tabascosauz said:


> You reckon I should just return it and get a reference 6950XT? $100 less, not a great deal, but at least there are no surprises.


Last gen is always a safer option than the newest shiny and untested thing, imo. There are some sweet discounts on it too, unlike any new release.


----------



## tabascosauz (Wednesday at 10:56 PM)

AusWolf said:


> Last gen is always a safer option than the newest shiny and untested thing, imo. There are some sweet discounts on it too, unlike any new release.



I'm not sure about discounts. 6800, 6800XT, 6900XT have fully disappeared. 6950XT is all that's left and it's $1060, vs. the $1150 of the 7900XT.

I'm now testing the S2721DGF with Freesync on, at 60Hz, in landscape orientation. If Sniper Elite 5 doesn't shit the bed due to the S2721DGF's presence, I think I might be able to get around Genshin's problems by moving it to the Dell screen where it's 60Hz.

If.


----------



## AusWolf (Wednesday at 11:04 PM)

tabascosauz said:


> I'm not sure about discounts. 6800, 6800XT, 6900XT have fully disappeared. 6950XT is all that's left and it's $1060, vs. the $1150 of the 7900XT.


That's a shame. Anyway, my point stands: last gen is always a safer option. New gen should only be bought by people who don't mind beta testing or putting up with teething issues.


----------



## tabascosauz (Wednesday at 11:20 PM)

No change to the fan stop curve in 23.1.1. Not touching Radeon Software with a 10-foot pole anymore, so I guess I'll live with the long wind-down time. That shitty program crashed 3 times on me in the span of an hour yesterday. No bug report, no dialog box, no error logged, no nothing, it just decided to leave.

Genshin works on the S2721DGF. Buttery smooth with just Windowed VRR ticked in Windows. Except the ridiculous 60Hz input lag, but I'll live. Still choppy af if I Win+Shift+arrowkeys the game onto the M32Q, I guess that's the beauty of purely software-based VRR 

And Sniper Elite still works the same as it did yesterday on one monitor. Maybe Radeon just isn't prepared to accept vertical monitors?

Back to the testing suite.



AusWolf said:


> That's a shame. Anyway, my point stands: last gen is always a safer option. New gen should only be bought by people who don't mind beta testing or putting up with teething issues.



Though, I wonder if that's actually reasonable to accept. I'm not qualified enough to comment on Radeon, but at this point we should be paid for the investigative work we do for the AGESA team. Outside of AMD, launch day still seems to mean "the product is ready", not "congratulations, you ARE the engineering sample"


----------



## NicklasAPJ (Wednesday at 11:38 PM)

Just got my 7900 XTX Red Devil Limited, and it's a great card, buuut the OC side is a bit weird? This card needs more TDP: it can "run" at 3100-3200MHz core, but you lose a lot of FPS compared to 2900/3000MHz because it hits the TDP limit.








Will make a video about it soon!?


----------



## kapone32 (Yesterday at 12:10 AM)

So I installed the latest driver and it did run the stress test properly (for 10 seconds), which is better, as last time it was 5 seconds. The idle power draw has also gone down, and the temps in Adrenalin are about 3 degrees cooler, but the memory is still at 58C in HWiNFO. It is the only temp that is higher than 40C. I am really interested to see what adding a waterblock will do to the memory temp.


----------



## tabascosauz (Yesterday at 2:21 AM)

kapone32 said:


> So I installed the latest driver and it did run the stress test properly (for 10 seconds), which is better, as last time it was 5 seconds. The idle power draw has also gone down, and the temps in Adrenalin are about 3 degrees cooler, but the memory is still at 58C in HWiNFO. It is the only temp that is higher than 40C. I am really interested to see what adding a waterblock will do to the memory temp.



I think there might still be room for power optimization on the new 20Gbps GDDR6 on these cards. At the lowest possible idle power (15-20W), the core is pretty much completely gated at 0.0W, yet the memory still draws several watts. Similar to chiplet Ryzen (cores parked) but also not (I doubt the MCDs are responsible for the power draw; have to double-check the metrics).

At idle mine is also between 50-60C. No problems at load, around 82-88C memory, but I can imagine that the overclockers will need to get creative to cool this new generation of GDDR6 (just like GDDR6X). I saw someone mod the G6X on the TUF 3070 Ti/3080 with copper heatspreaders.

G6X on my last card ran a bit cooler (40ish idle, 80C load), but the cooler was also physically bigger, and density was half (1GB packages).


----------



## kapone32 (Yesterday at 3:03 AM)

tabascosauz said:


> I think there might still be room for power optimization on the new 20Gbps GDDR6 on these cards. At lowest possible idle power (15-20W), core is pretty much completely gated at 0.0W, yet memory still floats so many watts. Similar to chiplet Ryzen (cores parked) but also not (I doubt MCD are responsible for the power draw, have to double check the metrics).
> 
> At idle mine is also between 50-60C. No problems at load, around 82-88C memory, but I can imagine that the overclockers will need to get creative to cool this new generation of GDDR6 (just like GDDR6X). I saw someone mod the G6X on the TUF 3070 Ti/3080 with copper heatspreaders.
> 
> G6X on my last card ran a bit cooler (40ish idle, 80C load), but the cooler was also physically bigger, and density was half (1GB packages).


Agreed, I see the memory as being the culprit for the 110C problem as well. I don't see the core but that top rank of 8 GB of memory as the issue. The fact that the problems show up with the card horizontal but not vertical, as if the coolant were low, points to some active part of the board while gaming. I wonder how much system RAM these GPUs even use. This new driver may help to mitigate it by a few degrees.


----------



## tabascosauz (Yesterday at 3:13 AM)

kapone32 said:


> Agreed, I see the memory as being the culprit for the 110 C problem as well. I don't see the core but that top rank of 8 GB of Memory. could be the issue. As if the coolant is low having it horizontal showing those problems but not vertical speaks to some active part of the board Gaming. I wonder how  much system RAM these GPUs even use. This new driver may help to mitigate it by a few degrees.



23.1.1 doesn't change much in the way of anything. Dunno about Radeon Software since I don't have it.

The 110C issue is not quite the same issue. There are a variety of readings visible in a GPU Hotspots dropdown in HWiNFO. If you upgraded from RX 6000 you should reset HWiNFO. GCD hotspot, MCD hotspot (1-6 for the XTX), each VRM component, and GDDR6 all have separate hotspot readings.

I've been on the HWInfo beta, it has some Navi 31 improvements. Looks like the release has been merged with the beta today


----------



## kapone32 (Yesterday at 5:44 AM)

tabascosauz said:


> 23.1.1 doesn't change much in the way of anything. Dunno about Radeon Software since I don't have it.
> 
> The 110C issue is not quite the same issue. there are a variety of readings visible in a GPU Hotspots dropdown in HWInfo. If you upgraded from RX6000 you should reset HWInfo. GCD Hotspot, MCD Hotspot (1-6 for XTX), each VRM component, and GDDR6 all have separate hotspot readings.
> 
> I've been on the HWInfo beta, it has some Navi 31 improvements. Looks like the release has been merged with the beta today


I have the updated version and have commented before that I find the memory temp is higher than the hot spot temp. I was using the PC with zero-fan mode enabled and had 96C on the memory at a 2300 RPM fan speed.


----------



## AusWolf (Yesterday at 7:04 AM)

tabascosauz said:


> Though, I wonder if that is actually reasonable to accept. Not qualified enough to comment on Radeon but at this point we should be paid for the investigative work that we do for the AGESA team. Outside of AMD launch day seems to still mean "the product is ready", not "congratulations, you ARE the engineering sample"


I completely agree, though it has been happening all across the board. You can look at Hollywood and all the half-written rubbish they're dumping on us. Or the gaming industry and the scandals of half-finished games that get patched to an acceptable state at a later time. Every new generation of product or service you're buying nowadays would have been classed as "in beta state" 10 years ago. This is what happens when the whole Western world's business model has been set up to have X amount of time to develop a product that has to be X times better than the last and the scale never changes. It's all about meeting business targets and quotas - quality doesn't matter. It's disgusting, but this is the world we're consuming, unfortunately.


----------



## Space Lynx (Yesterday at 7:34 AM)

man this thread is really making me want to keep my 6800 XT now. I fixed the hotspot temp on my 6800 XT; it was a GPU sag issue, since this is a giant-ass card. Measured and cut a wooden pencil to prop up the end of it, so there is no more sag, and my hotspot went from 99 celsius to 84 celsius in the same tests in the same config, etc.

i don't think I want next gen anymore. I mean, I would gain what, 30 fps in several games, 40 in some, but my 6800 XT is oc'd like a mofo and still runs good temps, getting 2520 core in games now, and gained 10 fps, so really I am looking at the 7900 XT being maybe 20 fps faster at 1440p... based on the reviews of games I play, anyway. God of War I would still benefit from, like a 40 fps gain, but that is an outlier in the benchmarks.


----------



## AusWolf (Yesterday at 7:38 AM)

Space Lynx said:


> man this thread is really making me want to keep my 6800 XT now. I fixed the hotspot temp on my 6800 XT, it was a gpu sag issue since this is a giant ass card. measured and cut a wooden pencil to prop up the end of it, so there is no more sag, and my hotspot went from 99 celsius to 84 celsius in same tests in same config, etc.
> 
> i don't think I want next gen anymore, I mean I would gain what 30 fps in several games, 40 in some, but my 6800 xt is oc'd like a  mofo and still runs good temps, getting 2520 core in games now, and gained 10 fps, so really i am looking at what 7900 xt maybe be 20 fps faster at 1440p... based on the reviews of games I play anyway. God of War I would still benefit from like a 40 fps gain, but that is an outlier in the benchmarks.


I agree. Upgrading every gen is a waste anyway (unless you just like to tinker with stuff).

There is this thread, though...


----------



## Space Lynx (Yesterday at 7:40 AM)

AusWolf said:


> I agree. Upgrading every gen is a waste anyway (unless you just like to tinker with stuff).
> 
> There is this thread, though...



Yeah, I already saw that thread, it doesn't worry me. My particular 6800 XT is a monster, a 3-power-plug design, so my guess is, whatever AMD may have fucked up in the drivers won't affect a card of this design, which is built for higher voltages out the door. I may be wrong, but I am not worried, as I have been running the latest drivers for over a month now with stress tests and a lot of gaming, 0 issues.


----------



## AusWolf (Yesterday at 7:45 AM)

Space Lynx said:


> Yeah, I already saw that thread, it doesn't worry me. My particular 6800 XT is a monster. 3 power plug design, so my guess is, w.e AMD may have fucked up in the drivers won't effect a card of this design which is designed for higher voltages out the door. I may be wrong, but I am not worried as I have been running latest drivers for over a month now with stress tests and gaming a lot, 0 issues.


That's cool.  There are a few things suspicious about it anyway which you may pick up in the thread. Here are a few:
1. Why is the GPU chip cracked? How does a driver do that?
2. Why only 6800 and 6900 cards?
3. What does the video's maker mean by "December 8 driver update"? Is it 22.11.2, or did the 7900 series' 22.12.1 driver install with these cards somehow? Both were issued on 8th Dec.
4. We don't know if the cards were overclocked, or if any OC tool was installed, etc.
And so on...


----------



## tabascosauz (Yesterday at 8:23 AM)

In case anyone's wondering: no, there is no fine wine to be had in 23.1.1

Stock with 2 monitors (don't mind the CPU score, typical anomaly):





Stock with 1 monitor:




Undervolt target 2750 @ 1050mV:


----------



## AusWolf (Yesterday at 8:37 AM)

tabascosauz said:


> in case anyone's wondering, no, there is no fine wine to be had in 23.1.1
> 
> Stock with 2 monitors (don't mind the CPU score, typical anomaly):
> ...


Wait a second... in your 1 monitor setup, your *CPU* score is higher? How does that work?


----------



## kurosagi01 (Yesterday at 8:39 AM)

tabascosauz said:


> AMD Software: Adrenalin Edition 23.1.1 for AMD Radeon™ RX 7900 Series Graphics Release Notes | AMD
> 
> New 23.1.1 driver, anyone on it yet? From the release notes:
> 
> ...


I'll be sticking with the previous driver, unless a future game requires a newer one.


NicklasAPJ said:


> Just got my 7900 XTX Red Devil Limited, and it's a great card, buuut the OC side is a bit weird? This card needs more TDP: it can "run" at 3100-3200MHz core, but you lose a lot of FPS compared to 2900/3000MHz because it hits the TDP limit.
> 
> 
> 
> ...


This is something Jay has pointed out with overclocking the 7900s: you can push the power limit, voltage and core clock, and the core will boost higher, but you get less performance because the memory clock drops dramatically.








For me the best thing to do is: leave the target clock speed alone, set the power limit to +15%, see what clock speed Adrenalin auto-assigns, then manually set that clock speed and undervolt until you find the lowest stable voltage.
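
The procedure above boils down to a simple downward search: fix the clock target, step the voltage down, and stop at the last value that survives a stress test. A hypothetical sketch of that loop (Adrenalin has no scripting API, so the stability check here is just a placeholder for a manual stress-test pass/fail; all names and numbers are illustrative):

```python
# Hypothetical sketch of the "undervolt until stable" procedure:
# step the voltage down until a stress test fails, then keep the
# last passing value plus a small safety margin.
def find_stable_voltage(is_stable, start_mv=1100, floor_mv=950,
                        step_mv=10, margin_mv=10):
    """is_stable(mv) stands in for a manual stress-test pass/fail."""
    best = start_mv
    mv = start_mv
    while mv - step_mv >= floor_mv and is_stable(mv - step_mv):
        mv -= step_mv
        best = mv
    return best + margin_mv  # headroom over the last passing value

# Pretend this particular card happens to be stable down to 1020mV:
result = find_stable_voltage(lambda mv: mv >= 1020)
print(result)  # 1030
```

In practice each `is_stable` call is an hour of gaming or a benchmark loop, which is why people step in 10-25mV increments rather than binary-searching.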


----------



## tabascosauz (Yesterday at 8:55 AM)

AusWolf said:


> Wait a second... in your 1 monitor setup, your *CPU* score is higher? How does that work?



12800-12900 is my normal score at -25. CPU just decided to run a little slow that run. Happens.

Also, I'm still figuring out the full extent of it, but latest HWInfo is causing a fair bit of stuttering in games. I've stopped using it for now and just using the Radeon overlay


----------



## AusWolf (Yesterday at 9:02 AM)

tabascosauz said:


> 12800-12900 is my normal score at -25. CPU just decided to run a little slow that run. Happens.
> 
> Also, I'm still figuring out the full extent of it, but latest HWInfo is causing a fair bit of stuttering in games. I've stopped using it for now and just using the Radeon overlay


Yeah, but your GPU score is higher with 2 monitors (within margin of error, I'd say), so that can't be the cause of the difference in total score.


----------



## tabascosauz (Yesterday at 9:37 AM)

@kapone32 have you set a custom fan curve yet? These stock fan curves are batshit insane. Basically, until my GPU hit 70C edge temp, the fans wouldn't spin past 500-1000rpm, which was also visually annoying because two of the fans wobble below 700rpm.

I kid you not, light games pulling about 100W ran almost as warm as 250-300W loads because the fans were stuck at 500rpm.

I set a new curve that has the fans start around 800rpm at the fan stop threshold, continuing gradually up to about 72% PWM at 70C, and afterwards a straight ramp to 100% (which I will basically never hit when undervolted). Dunno about the fans on your Pulse, but the reference fans are pleasantly quiet up to about 1500rpm, I'd say.
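
A curve like that is just piecewise-linear interpolation between (temperature, duty) points with a fan-stop threshold underneath. A rough sketch, with illustrative points rather than the exact values above:

```python
# Sketch of a piecewise-linear fan curve: fan stop below a threshold,
# then linear interpolation between (edge temp C, fan duty %) points.
CURVE = [(50, 25), (70, 72), (90, 100)]  # illustrative points, not mine exactly
FAN_STOP_BELOW = 50                      # fans off under this edge temperature

def fan_duty(temp_c):
    """Return fan duty (percent PWM) for a given edge temperature."""
    if temp_c < FAN_STOP_BELOW:
        return 0
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c < t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[0][1]

print(fan_duty(45))  # 0 (fan stop)
print(fan_duty(60))  # 48.5, halfway along the 25% -> 72% segment
print(fan_duty(80))  # 86.0, halfway up the ramp to 100%
```

The steep segment after 70C is what makes light loads quiet while still letting heavy loads ramp hard, which is the behavior the stock curve gets backwards.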



AusWolf said:


> Yeah, but your GPU score is higher with 2 monitors (within margin of error, I'd say), so that can't be the cause of the difference in total score.



It's only a 200 point difference total, pretty safe to say the CPU score is responsible for that one. Doesn't matter anyway because the undervolt score is what I'm after. To get past 13k CPU score all I have to do is dial my CO back up to the usual -28; I was back down at -25 just for troubleshooting.

I know the recommendation right now is +15% power limit and leave the clock target, but I'm not really interested in running XTX power (355W) through the XT reference cooler. It handles it fine but it's kinda antithetical to what I'm after with undervolting.

Here's a stock power limit run (308W) at 1050mV; it runs a few degrees cooler while scoring ever so slightly better than stock. Gonna try to keep reducing volts, but if this is a stable profile then I think I'll just leave it. And wait patiently for the finewine LOL





So far Genshin is happy, MW19 is happy, MW2 is happy (about 5C lower), Sniper Elite 5 is happy, next up are Arma and DCS.


----------



## kapone32 (Yesterday at 11:48 AM)

tabascosauz said:


> @kapone32 have you set a custom fan curve yet? These stock fan curves are batshit insane. Basically until my GPU hit 70C edge temp the fans wouldn't spin past 500-1000rpm. Which was also visually annoying because two of the fans wobble below 700rpm.
> 
> I kid you not, light games pulling about 100W ran almost as warm as 250-300W loads because the fans were stuck at 500rpm.
> 
> ...


Yes, I did set a fan profile, but I would have to redo it every time I restarted or had a Windows update. Since the latest driver, that seems to be working fine though. The memory goes to a maximum of 86C, so that is better, and I see the highest RPM at 3264

Yeah my block is out for Delivery!! Daughter is at school and Wife is at work so no distractions. I can't wait to get a look at the bare PCB.


----------



## MrPerforations (Yesterday at 3:01 PM)

hello Club,
   I'm having no luck getting anything out of the Sapphire™ Radeon RX 7900 XTX Nitro+.
I am having lots of no-display-output problems.
Are you having the same issue?
Using a DisplayPort 1.4 lead to the cheapest 4K monitor (28" @ £180)

Thought of a plan for the next time I do a thermal paste job:
going to put the thermal paste in a plastic bag and dump it in some hot water. The paste will then have a lower viscosity, using less, making it easier to spread and thinning out the layer.
Might want to try it, kapone32.


----------



## GamerGuy (Yesterday at 3:36 PM)

I'd suggest leaving the card untouched at first; get it running before messing around with repasting and such. I assume the driver was a clean install in a new OS, or that you've done your due diligence by running DDU to remove all previous drivers IF the card's going into a pre-existing system.


----------



## MrPerforations (Yesterday at 3:59 PM)

I water-blocked it yesterday and only had the idea today.


----------



## kapone32 (Yesterday at 4:58 PM)

MrPerforations said:


> i water blocked it yesterday and only had the idea today.


So since you put the block on you have no Display?


----------



## tabascosauz (Yesterday at 7:52 PM)

kapone32 said:


> Yes I did set a fan profile but I would have to re do it every time I restarted or had a Windows Update. Since the latest driver that seems to be working fine though. The Memory goes to a maximum of 86 C so that is better and I see the highest RPM at 3264
> 
> Yeah my block is out for Delivery!! Daughter is at school and Wife is at work so no distractions. I can't wait to get a look at the bare PCB.



3200rpm?? holy moly

I didn't notice any temp improvements from the driver alone, but my memory temp was down at 70C on the dot with the core undervolt profile in MW2. Granted, it was only running about 250W in that game, but I don't think an extra 58W is enough to negate a 12-20C decrease in memory junction. So it might be worth chopping off some volts, if you haven't already.

I wish Radeon software could read mem temps so I wouldn't always have to fire up hwinfo and introduce stutters just to see it



MrPerforations said:


> Thought of a plan for the next time I do a thermal paste job:
> going to put the thermal paste in a plastic bag and dump it in some hot water. The paste will then have a lower viscosity, using less, making it easier to spread and thinning out the layer.
> Might want to try it, kapone32.



For that to work, the water's gotta be quite warm, it needs a long soak, and you need to apply immediately after taking it out of the water.


----------



## kapone32 (Yesterday at 8:14 PM)

tabascosauz said:


> 3200rpm?? holy moly
> 
> I didn't notice any temp improvements from the driver alone, but my memory temp was down at 70C on the dot with the core undervolt profile in MW2. Granted, it was only running about 250W in that game, but I don't think an extra 58W is enough to negate a 12-20C decrease in memory junction. So it might be worth chopping off some volts, if you haven't already.
> 
> I wish Radeon software could read mem temps so I wouldn't always have to fire up hwinfo and introduce stutters just to see it


Well I just got an email from UPS that my block is almost here. I am going to see if Watercooling is the way to go. No more fan profiles and no more memory temp issues (hopefully). That might be the solution. We will see


----------



## tabascosauz (Yesterday at 10:24 PM)

"Nvidia drivers are just as bad as AMD drivers" 

I'm not sure what exactly this is yet, because the first and second fields in LatencyMon don't show the offending process name. All I know is that 20ms of stutter makes games borderline unplayable.


----------

