
AMD RX 7000 series GPU Owners' Club

Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
My vote is for the 7900 XTX. It's a monster of a card that I'm very happy with.
Even if the rumours about RDNA 4 offering 7900 XTX-like performance for a much lower price and power consumption are true?

Thanks for answering, by the way. :)
 

Deleted member 237813

Guest
I've managed to settle into the 2600 MHz region, but it's not fully stable yet. Vcore is at 1025.
No way is this stable in very demanding games. Avatar melts your core clocks and will show instabilities very well.

There are games where 950 mV would be stable, but Avatar needs 1070 mV for 2600 MHz. Since it's only an offset, it only affects core clocks anyway. RDNA 3 can only save power via core clock reduction and the power limit.

I set it with Chill on a per-game basis. It's awesome to simply set a profile per game. In RDR 2 I only use 2200 MHz: with Chill locked to 80 FPS it sits at 70-80 FPS, which is all I need, at 240-250 W in native 4K. :)

Even if the rumours about RDNA 4 offering 7900 XTX-like performance for a much lower price and power consumption are true?

Thanks for answering, by the way. :)
That should be the case regardless whenever a new generation releases; we don't need rumours for that. The rumour is that there's no high-end SKU. Getting last gen's performance for less money is normal.
 
Joined
Nov 18, 2010
Messages
7,596 (1.48/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
Even if the rumours about RDNA 4 offering 7900 XTX-like performance for a much lower price and power consumption are true?

Thanks for answering, by the way. :)

It came from MLID... it's a hoax at best. Front-page writers had a slow news day and didn't have anything better to write about... yellow press, like local news reporting that firefighters got a cat down from a tree.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It came from MLID... it's a hoax at best. Front-page writers had a slow news day and didn't have anything better to write about... yellow press, like local news reporting that firefighters got a cat down from a tree.
Good point.

So how is the 7900 XTX on idle power consumption? Last I checked, it was terrible, but back then the 7800 XT wasn't good either, and since then its idle draw has been lowered quite a lot by the newest drivers.
 

Deleted member 237813

Guest
Lower than Nvidia if we're talking true idle. It also depends on the model, of course.
 
Joined
Sep 3, 2019
Messages
3,582 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Good point.

So how is the 7900 XTX on idle power consumption? Last I checked, it was terrible, but back then the 7800 XT wasn't good either, and since then its idle draw has been lowered quite a lot by the newest drivers.
I could believe RDNA 4 at 7900 XT+ level for a $600-700 MSRP, but a 7900 XTX at the 7800 XT's price ($550?) is too far-fetched...

As for (TBP) power, the Nitro+ XTX with the latest official 24.1.1 hovers between <10 W (rarely I've seen a 5 W minimum) and <30 W, depending on what's on the screen and whether you're moving the pointer, scrolling, or the taskbar is showing/hiding, and so on.
When something is moving, like scrolling a page, power briefly shoots to 45+ W.
With the screen saver it goes to around 45-50 W, and with the monitor off, <20 W (I've seen an average of about 15 W).
Watching movies (full screen or not) depends on where and what it is. From 45 to 65 W, maybe.

This is from the last 20 minutes while loading a few videos and typing here, but with no true idling, as in nothing moving and no screen saver.
True idling in my case is 10~25 W, so the average should be about like monitor-off.
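If anyone wants to log this themselves on Linux, the amdgpu driver exposes board power through hwmon; here's a minimal sketch. The `card0` path is an assumption (it varies per system), and some kernels expose `power1_input` instead of `power1_average`:

```python
import glob
import time
from pathlib import Path

def read_tbp_watts(raw: str) -> float:
    """amdgpu's hwmon power files report microwatts; convert to watts."""
    return int(raw.strip()) / 1_000_000

def poll(seconds: int = 20) -> list[float]:
    # hwmon index varies, hence the glob; card0 is an assumption
    sensor = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")[0]
    samples = []
    for _ in range(seconds):
        samples.append(read_tbp_watts(Path(sensor).read_text()))
        time.sleep(1)
    return samples

# the pure conversion is easy to sanity-check:
print(read_tbp_watts("25000000\n"))  # 25.0
# on the machine itself: s = poll(); print(min(s), sum(s) / len(s), max(s))
```

Run it once while truly idle and once while scrolling a page, and you should see roughly the spread described above.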

 

Deleted member 237813

Guest
It will depend on how Nvidia prices their cards and how fast they are. If a 5070 matches the 7900 XTX for $699, AMD has no reason to give it away for $599 with more VRAM and more performance, as usual.
 
Joined
Nov 18, 2010
Messages
7,596 (1.48/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
Good point.

So how is the 7900 XTX on idle power consumption? The last I checked, it was terrible, but back then, the 7800 XT wasn't good either, and since then, it got lowered by quite a lot by the newest drivers.

I had an argument with W1z about it, as I don't believe his dedicated measurement-device setup is very accurate here. The point is: whatever the numbers, look at the frequencies. If your core and VRAM clocks sleep at their lowest while you do casual stuff, everything is fixed and fine; it's very hardware-setup dependent. And the latest drivers have fixed it even more, you're right. W1z also doesn't use the latest drivers that brought more fixes in the recent Super review tables. I understand it's a pain to re-bench for basically nothing, so that's not a complaint; it's just something for us to understand and take into account. Just look at your particular setup.

I'm living on RADV and Linux lately; I dual-boot into Windows only for the darn Lightroom. In Linux with a dual-monitor setup, everything hovers as low as it can for my setup: VRAM at 96 MHz and core at 4-50 MHz (5-20 W peaks, 7 W average). There are OC bugs for the 79xx series that are still being worked on in kernel 6.8. It's connected with the DRM firmware value bits; the commands to alter values have changed and everyone is a bit confused. At stock, though, everything works fine. Gaming-wise, Steam has done so much; for the titles I currently play, I don't see problems. I'm not fond of the RT image quality we're being presented, or of the PR machinery telling us that upscaling is better than native... yeah, sure... just disable the built-in crap TAA. So upscaling is only used for rare emulation cases here. The vaseline-like image quality of recent years amuses me; I'd rather play jaggy with no AA, but sharp. People say DLSS/DLAA does wonders and is better than native. Sure it's better, if a smarter sharpening algorithm is cleaning up after the current AA/shader mess that created a problem which shouldn't be there in the first place. RT-wise, path tracing is better, sure... but at what price? $2K to play at ~30-60 FPS on a 4090? No thanks.
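A quick way to apply the "look at the frequencies" advice on Linux is amdgpu's `pp_dpm_sclk` / `pp_dpm_mclk` files, which list the DPM levels and flag the active one. A small parser sketch; the file format is as I understand it from the kernel docs, and the `card0` index is an assumption:

```python
import re
from pathlib import Path

def active_level_mhz(text: str) -> int:
    """amdgpu lists DPM levels as 'N: <freq>Mhz' and marks the
    currently active one with a trailing '*'; return that frequency."""
    for line in text.splitlines():
        m = re.match(r"\s*\d+:\s*(\d+)\s*Mhz\s*\*", line)
        if m:
            return int(m.group(1))
    raise ValueError("no active level flagged")

sample = "0: 96Mhz *\n1: 456Mhz\n2: 1249Mhz"
print(active_level_mhz(sample))  # 96

# on the machine itself:
# active_level_mhz(Path("/sys/class/drm/card0/device/pp_dpm_mclk").read_text())
```

If VRAM sits at its lowest level (96 MHz here) while you do casual desktop stuff, the idle-power fix is working on your setup.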
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I could believe RDNA 4 at 7900 XT+ level for a $600-700 MSRP, but a 7900 XTX at the 7800 XT's price ($550?) is too far-fetched...

As for (TBP) power, the Nitro+ XTX with the latest official 24.1.1 hovers between <10 W (rarely I've seen a 5 W minimum) and <30 W, depending on what's on the screen and whether you're moving the pointer, scrolling, or the taskbar is showing/hiding, and so on.
When something is moving, like scrolling a page, power briefly shoots to 45+ W.
With the screen saver it goes to around 45-50 W, and with the monitor off, <20 W (I've seen an average of about 15 W).
Watching movies (full screen or not) depends on where and what it is. From 45 to 65 W, maybe.

This is from the last 20 minutes while loading a few videos and typing here, but with no true idling, as in nothing moving and no screen saver.
True idling in my case is 10~25 W, so the average should be about like monitor-off.

That looks very much like my 7800 XT at idle / low load. Thanks. :)

I had an argument with W1z about it, as I don't believe his dedicated measurement-device setup is very accurate here. The point is: whatever the numbers, look at the frequencies. If your core and VRAM clocks sleep at their lowest while you do casual stuff, everything is fixed and fine; it's very hardware-setup dependent. And the latest drivers have fixed it even more, you're right. W1z also doesn't use the latest drivers that brought more fixes in the recent Super review tables. I understand it's a pain to re-bench for basically nothing, so that's not a complaint; it's just something for us to understand and take into account. Just look at your particular setup.

I'm living on RADV and Linux lately; I dual-boot into Windows only for the darn Lightroom. In Linux with a dual-monitor setup, everything hovers as low as it can for my setup: VRAM at 96 MHz and core at 4-50 MHz (5-20 W peaks, 7 W average). There are OC bugs for the 79xx series that are still being worked on in kernel 6.8. It's connected with the DRM firmware value bits; the commands to alter values have changed and everyone is a bit confused. At stock, though, everything works fine. Gaming-wise, Steam has done so much; for the titles I currently play, I don't see problems. I'm not fond of the RT image quality we're being presented, or of the PR machinery telling us that upscaling is better than native... yeah, sure... just disable the built-in crap TAA. So upscaling is only used for rare emulation cases here. The vaseline-like image quality of recent years amuses me; I'd rather play jaggy with no AA, but sharp. People say DLSS/DLAA does wonders and is better than native. Sure it's better, if a smarter sharpening algorithm is cleaning up after the current AA/shader mess that created a problem which shouldn't be there in the first place. RT-wise, path tracing is better, sure... but at what price? $2K to play at ~30-60 FPS on a 4090? No thanks.
I couldn't agree more! Upscaling is a tool to help when your GPU can't handle native rendering. Sure, DLAA is great, I guess, but we have RSR which does the same thing just without the overhyped green AI magic. Not that I'd ever want more than 3440x1440 native anyway.

Paying south of $2K for a graphics card is a definite no for me as well. Heck, I could get a Sapphire Pulse 7900 XTX for 950 GBP, and even that feels like pulling teeth. I'm not sure if it's worth forking over the cash by selling the 7800 XT and some random stuff, then watching its market value deteriorate when RDNA 4 is out. Maybe just waiting for RDNA 4 and using lower settings in the meantime would be a more sensible plan. I'm really in conflict here (first world problems, I know). :ohwell:
 
Joined
Nov 18, 2010
Messages
7,596 (1.48/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX. Water block. Crossflashed.
Storage Optane 900P[Fedora] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO+SN560 1TB(W11)
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) SMSL RAW-MDA1 DAC
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 41
Paying south of $2K for a graphics card is a definite no for me as well. Heck, I could get a Sapphire Pulse 7900 XTX for 950 GBP, and even that feels like pulling teeth. I'm not sure if it's worth forking over the cash by selling the 7800 XT and some random stuff, then watching its market value deteriorate when RDNA 4 is out. Maybe just waiting for RDNA 4 and using lower settings in the meantime would be a more sensible plan. I'm really in conflict here (first world problems, I know). :ohwell:

I justified the purchase with Lightroom's AI denoise speed, not gaming whatsoever. It's a must these days, as it allows me to shoot my A7m4 at much higher ISO. My previous card took about 50 s per image; this card does it in 7-10 s. Multiply that by the few hundred pics you take on one weekend trip if you refrain from using burst shots. So I save a lot of my almost non-existent hobby time, and the time is worth the money to keep my usual workflow; leaving overnight batch jobs running ended up with much having to be redone, or simply crashed, because it's Adobe.

I don't believe the 7900 XTX will fall below 900 quid soon, as the Supers are still expensive in the EU and brought nothing really new. It will sooner fall short on supply than get cheaper; they had their lesson with the 6xxx-series overstock that hurt the 7xxx series. That's why I picked up my card last summer, and I was right: the price didn't drop. I got mine for around €970. Selling the 7800 XT is a great idea no matter how you look at it. It will age terribly and lose value faster.

I don't have high hopes for RDNA 4 being faster in general. It will have much faster RT performance for sure; there will be backports from their CDNA branch. Not that AMD cares about RT; it will be a byproduct of the development where the real cash flows in for them. Other than that... it will be cheaper for them to make. The power envelope is unknown; they'll decide it at the very end, with the last beta silicon they get in their labs, by measuring optimal settings. It has never been the other way around. If it clocks high using more power, they won't cripple it for the sake of a lower TDP; nobody really knows how the tech node will behave with their final design, so everything we read now is guesswork. Their money is on the upcoming console refresh with roughly RDNA 3.0-3.5, and on getting that right. Nvidia users will sulk at that again. Somehow Sony and Microsoft also don't see any point in mainstream RT; I wonder why.

Well... considering that photography is my hobby too... film as well... one roll of E100 costs around €30, but if you convert that into craft beer values, it's only like 3-4 beers... everything is expensive these days.
 
Joined
Jul 20, 2020
Messages
1,152 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
Well, I don't think I'll bother with any sort of undervolting this time around. I'm seeing effective clocks of nearly 3.2 GHz at the stock setup (apart from the fan curve tweak), which is just mind-blowing to me.

The card is whisper quiet, cool and plays everything I can think to throw at it. Really happy with the purchase :)

FWIW, when testing my Pulse 7700 XT (first mention in this thread??), it can boost to the same, but as some have mentioned, it's load- and power-dependent. Ya also gotta set the Min and Max (OC and voltage) in Adrenalin properly too.



This is multiple runs of the Forza Horizon 4 benchmark; 3174 MHz is from the last run, and it still doesn't hit max power during most of the run and stays above 3100 for the whole thing. The fourth from last is also over 3000; I was just ramping up at that point before testing some lower-power settings.

On the other hand, CP2077 and Hogwarts Legacy only get to ~2800 MHz with the same settings, pegged at the max 262 W power use. So it's very load-dependent. It seems like all the chiplet 7xxx GPUs perform similarly. I wonder if AMD was targeting 3000-3200 MHz, got surprised by excessive power usage in DX12 or heavier loads, and dialed things back.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
FWIW, when testing my Pulse 7700 XT (first mention in this thread??), it can boost to the same, but as some have mentioned, it's load- and power-dependent. Ya also gotta set the Min and Max (OC and voltage) in Adrenalin properly too.


This is multiple runs of the Forza Horizon 4 benchmark; 3174 MHz is from the last run, and it still doesn't hit max power during most of the run and stays above 3100 for the whole thing. The fourth from last is also over 3000; I was just ramping up at that point before testing some lower-power settings.

On the other hand, CP2077 and Hogwarts Legacy only get to ~2800 MHz with the same settings, pegged at the max 262 W power use. So it's very load-dependent. It seems like all the chiplet 7xxx GPUs perform similarly. I wonder if AMD was targeting 3000-3200 MHz, got surprised by excessive power usage in DX12 or heavier loads, and dialed things back.
I don't think AMD "got surprised" by anything. It's just that the card's boost algorithm can detect what's best in every scenario. Hogwarts and CP are complex DX12 titles that also use some RT if enabled. I guess that uses more parts of the GPU than Forza does. With more parts active, there's less headroom for extra clock speed, just like your CPU boosts higher when unpacking files than in some complex AVX work. I'm pretty sure it's normal and accounted for.
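That headroom argument can be sketched as a toy fixed-power-budget model: with the same board power limit, lighting up more of the die leaves less budget per MHz. The unit counts and watts-per-unit figures below are invented purely for illustration; this is nothing like AMD's actual boost algorithm:

```python
def boost_clock(power_limit_w: float, active_units: int,
                w_per_unit_per_ghz: float, f_max_ghz: float) -> float:
    """Sustainable clock under a fixed power budget (toy model):
    either the hard frequency cap or the power-limited clock wins."""
    return min(f_max_ghz, power_limit_w / (active_units * w_per_unit_per_ghz))

# a 'light' game touching few units vs a 'heavy' DX12 title
print(boost_clock(262, 30, 1.6, 3.2))            # 3.2  -> frequency-capped
print(round(boost_clock(262, 58, 1.6, 3.2), 2))  # 2.82 -> power-capped
```

Same card, same 262 W limit: the lighter load runs into the clock ceiling, the heavier one into the power ceiling, which matches the FH4 vs CP2077/Hogwarts behaviour described above.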
 
Joined
Jul 20, 2020
Messages
1,152 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I don't think AMD "got surprised" by anything. It's just that the card's boost algorithm can detect what's best in every scenario. Hogwarts and CP are complex DX12 titles that also use some RT if enabled. I guess that uses more parts of the GPU than Forza does. With more parts active, there's less headroom for extra clock speed, just like your CPU boosts higher when unpacking files than in some complex AVX work. I'm pretty sure it's normal and accounted for.

Yeah, FH4 is a light load on all GPUs. I tested AC:Ody with the Vulkan rendering mod and it also hit 3000+ with maxed-out power, which I hadn't seen on the card before; it's just around 2800 max in my typical heavier DX12 games. W1zz mentioned in the review that he saw a large OC on the 7700 XTs to over 3000 MHz, but that was also in an older, light benchmark: Unigine Heaven. This was something I hadn't seen at all until today, so it was a bit of a surprise.

With older architectures (Turing, Pascal, RDNA2), a card would go unstable when overclocked too far in a light load before the power limit was hit. I was surprised to see different behaviour here: it stayed stable up to almost 3200 MHz. Possibly higher; I never got it to crash, as I was done at 3170.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Yeah, FH4 is a light load on all GPUs. I tested AC:Ody with the Vulkan rendering mod and it also hit 3000+ with maxed-out power, which I hadn't seen on the card before; it's just around 2800 max in my typical heavier DX12 games. W1zz mentioned in the review that he saw a large OC on the 7700 XTs to over 3000 MHz, but that was also in an older, light benchmark: Unigine Heaven. This was something I hadn't seen at all until today, so it was a bit of a surprise.

With older architectures (Turing, Pascal, RDNA2), a card would go unstable when overclocked too far in a light load before the power limit was hit. I was surprised to see different behaviour here: it stayed stable up to almost 3200 MHz. Possibly higher; I never got it to crash, as I was done at 3170.
Those are super high clocks! :eek:

Maybe I should start tinkering with my 7800 XT. The only thing I've done so far is push the power limit to the max, which makes it more stable at 2400 MHz in every game instead of fluctuating based on different loads (2200 MHz in some games, 2300-2350 in others).

I wish it were as easy to overclock RDNA 3 as it is the 6500 XT. With that card, I can drag every slider to the max, and it will hit a hard wall at 2950 MHz. Not that it matters much, though. An extra 10% on nothing is still nothing. :laugh:
 
Joined
Sep 3, 2019
Messages
3,582 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Some undervolting can raise clocks within the same power level.
The voltage slider in the drivers, at least for RDNA 3, is a kind of offset-curve setting, close to what Curve Optimizer is for Ryzen.
And a UV that works in one game won't necessarily work in another with a different type of load; just keep that in mind.
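On Linux, the equivalent of that driver slider lives in amdgpu's `pp_od_clk_voltage` file, where RDNA 3 takes a global voltage offset rather than per-point curve values. A hedged sketch: the `vo` token and path are my reading of the kernel's amdgpu overdrive docs, it needs root plus the overdrive bit set in `amdgpu.ppfeaturemask`, and the safe offset range here is an assumption:

```python
from pathlib import Path

# card index is an assumption; check /sys/class/drm on your system
OD_PATH = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")

def offset_commands(mv: int) -> list[str]:
    """Build the write sequence for a GFX voltage offset:
    'vo <mv>' sets the offset (one global step, Curve Optimizer style),
    'c' commits it."""
    if not -450 <= mv <= 0:  # assumed undervolt-only sane bounds
        raise ValueError("offset out of the assumed safe range")
    return [f"vo {mv}", "c"]

def apply_offset(mv: int) -> None:
    for cmd in offset_commands(mv):
        OD_PATH.write_text(cmd + "\n")

print(offset_commands(-100))  # ['vo -100', 'c']
```

And per the post above: retest per game, since an offset that's stable in one load isn't guaranteed stable in another.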
 
Joined
Jan 12, 2023
Messages
221 (0.31/day)
System Name IZALITH (or just "Lith")
Processor AMD Ryzen 7 7800X3D (4.2Ghz base, 5.0Ghz boost, -30 PBO offset)
Motherboard Gigabyte X670E Aorus Master Rev 1.0
Cooling Deepcool Gammaxx AG400 Single Tower
Memory Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled)
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil OC 24GB (2.39Ghz base, 2.56Ghz boost)
Storage 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD
Display(s) Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz)
Case Corsair 7000D Airflow Full Tower
Audio Device(s) Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set
Power Supply Corsair HX1000 Platinum Modular (1000W)
Mouse Logitech G502 X LIGHTSPEED Wireless Gaming Mouse
Keyboard Keychron K4 Wireless Mechanical Keyboard
Software Arch Linux
FWIW, when testing my Pulse 7700 XT (first mention in this thread??), it can boost to the same, but as some have mentioned, it's load- and power-dependent. Ya also gotta set the Min and Max (OC and voltage) in Adrenalin properly too.


This is multiple runs of the Forza Horizon 4 benchmark; 3174 MHz is from the last run, and it still doesn't hit max power during most of the run and stays above 3100 for the whole thing. The fourth from last is also over 3000; I was just ramping up at that point before testing some lower-power settings.

On the other hand, CP2077 and Hogwarts Legacy only get to ~2800 MHz with the same settings, pegged at the max 262 W power use. So it's very load-dependent. It seems like all the chiplet 7xxx GPUs perform similarly. I wonder if AMD was targeting 3000-3200 MHz, got surprised by excessive power usage in DX12 or heavier loads, and dialed things back.

Interesting, because again, Reddit informed me that 3 GHz was a pipe dream. At this point I have to assume they were either talking about non-AIB cards or (more likely) they were just dead wrong.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Some undervolting can raise clocks within the same power level.
The voltage slider in the drivers, at least for RDNA 3, is a kind of offset-curve setting, close to what Curve Optimizer is for Ryzen.
And a UV that works in one game won't necessarily work in another with a different type of load; just keep that in mind.
Oh no, that's way too much rocket science for me! :fear:

To be honest, I'm only curious what the card has left in it without too much messing around. If it gives me some extra stability in Avatar: FoP, that'll be a nice bonus. :)
 
Joined
Jan 19, 2023
Messages
247 (0.35/day)
Interesting, because again, Reddit informed me that 3 GHz was a pipe dream. At this point I have to assume they were either talking about non-AIB cards or (more likely) they were just dead wrong.
I mean, it depends. If we're talking about stock, so limited to 360 W, then yeah, reaching that 3 GHz might be harder or impossible. Even on my Nitro+ with the PL raised to 460 W, a UV of 1085 mV, and max clocks of 3200 MHz, it rarely reaches 3 GHz. In most games it sits between 2700-2900 MHz.
 
Joined
Sep 3, 2019
Messages
3,582 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
I've done some simple tests with the Superposition benchmark regarding UV and power limit.
I can't test the card at its max power level (460+ W) because of my 750 W PSU, so I did it at 406 W max (the default PL on the primary BIOS).

3 runs, 10+min each, with 2 different voltage levels and 2 different power levels.

1st run

Result:
Avg voltage: 0.964v
Avg power: 402W
Avg FPS: 108

Highlighted some interesting points


-------------------------------------------------

2nd run
Reduced voltage level from 1150 >> 1050
Same PL

Result:
Avg voltage: 0.940v (-24mv)
Avg power: 402W
Avg FPS: 114 (+5.5%)


-------------------------------------------------

3rd run
Reduced voltage level from 1150 >> 1050
Reduced power, trying to match the FPS from the 1st run

Result against 1st run
Avg voltage: 0.897v (-67mv)
Avg power: 362W (-10%)
Avg FPS: 111.5 (+3%)


--------------------------------------------------

This is an indication of how much the GPU is pushed to the edge and over its efficiency curve "just" for a small percentage of performance.
With -100mv on the votage slider GPU sustained +200MHz on clocks and 5+% on FPS
Additionally with -10% power limit GPU still sustained +100MHz on clocks and 3+% on FPS, let alone the slightly better thermals and acoustics.

I should also point out that this will not necessarily reflect exact performance and conditions in every game.
This was a "steady environment" with no movement at all, without the very high load variation that games have.
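For anyone who wants to double-check the percentages, the deltas quoted above can be recomputed from the averages. A quick sketch (numbers copied from the three runs; the FPS/W "efficiency" metric is my own addition, not something Superposition reports):

```python
# Recompute the run-to-run deltas from the averages posted above.
runs = {
    "run1 (1150 mV, 406 W PL)": {"volt": 0.964, "watt": 402, "fps": 108.0},
    "run2 (1050 mV, 406 W PL)": {"volt": 0.940, "watt": 402, "fps": 114.0},
    "run3 (1050 mV, -10% PL)":  {"volt": 0.897, "watt": 362, "fps": 111.5},
}

base = runs["run1 (1150 mV, 406 W PL)"]
for name, r in runs.items():
    fps_delta = (r["fps"] / base["fps"] - 1) * 100  # % FPS vs run1
    efficiency = r["fps"] / r["watt"]               # frames per watt
    print(f"{name}: {fps_delta:+.1f}% FPS vs run1, {efficiency:.3f} FPS/W")
```

Run 3 ends up the clear efficiency winner: slightly more FPS than run 1 at 10% less power.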
 
Last edited:
Joined
Sep 21, 2020
Messages
1,677 (1.08/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
In my testing, there seems to be a big discrepancy in the maximum reported GPU clock between different tools. For instance, my stock MBA 7900XTX regularly goes over 3 GHz and even approaches 3200 MHz in some games, according to HWiNFO's Effective Front End Clock.

But in these same games, tested with the same settings, max recorded clock rarely exceeds 2900 MHz when viewing CapFrameX logs :rolleyes:
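Besides the tools reading different clock domains, one thing worth ruling out is polling granularity: a tool that logs once per second can simply miss short spikes that a fine-grained counter catches. A toy sketch with made-up numbers (purely illustrative, not how either tool actually samples):

```python
# Illustration: a 1-second poller can miss short clock spikes that a
# fine-grained counter catches.  All numbers are hypothetical.
trace = [2800] * 1000        # "true" clock, sampled every 10 ms (10 s total)
trace[503:508] = [3200] * 5  # a single 50 ms spike to 3.2 GHz

fast_max = max(trace)         # fine-grained view: the spike is visible
slow_max = max(trace[::100])  # polling once per second: spike falls between samples
print(fast_max, slow_max)     # → 3200 2800
```

So two loggers can honestly disagree on "max clock" for the very same run.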
 
Joined
Jan 8, 2017
Messages
9,507 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
But in these same games, tested with the same settings, max recorded clock rarely exceeds 2900 MHz
Because that's the stock frequency limit on a 7900XTX.

AMD's frequency limit doesn't work the way people think it does; it behaves just like a power limit, momentarily allowing the GPU to exceed it and then paring back the clocks. The GPU always ramps to the highest possible frequency until it hits some imposed limit. This is not the same as Nvidia, which uses a fixed frequency/voltage curve, with the core falling somewhere along that curve depending on the limits. That's also why there is no frequency limit on those cards: the clocks are always bounded by the curve.

As far as I can tell, most programs, including AMD's own software, display the effective shader clock, which usually reports the lowest clocks of all, and I think that's also what the "max frequency limit" keeps track of. I think I posted a screenshot on this thread of my 7900XT hitting 3.2 GHz on the shader clock with an unstable undervolt; in practice, the only limit these cards have is the power limit.
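The "clocks ramp until a limit bites" behaviour described above can be shown with a toy model. This is purely illustrative (not real firmware logic), assuming dynamic power scales roughly with f·V²; the constants are made up to land in a plausible wattage range:

```python
# Toy model of an AMD-style governor: raise the clock until either the
# power limit or the soft frequency cap bites.  Not real firmware logic.

def power_draw(freq_mhz, volt):
    """Crude dynamic-power proxy: P ~ f * V^2 (arbitrary scale factor)."""
    return freq_mhz * volt ** 2 * 0.13

def sustained_clock(power_limit_w, volt, freq_cap_mhz=2900):
    """Opportunistically raise the clock in 10 MHz steps until a limit hits."""
    freq = 500
    while freq < freq_cap_mhz and power_draw(freq + 10, volt) <= power_limit_w:
        freq += 10
    return freq

# Undervolting frees power budget, so the sustained clock rises
# until the frequency cap takes over as the binding limit:
for mv in (1150, 1050, 950):
    f = sustained_clock(power_limit_w=400, volt=mv / 1000)
    print(f"{mv} mV -> {f} MHz sustained under a 400 W limit")
```

It also mirrors the Superposition results earlier in the thread: at stock voltage the power limit is the binding constraint, and every mV you shave converts directly into clock headroom.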
 
Last edited:
Joined
Sep 3, 2019
Messages
3,582 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
In my testing there seems to be a big discrepancy in the maximum reported GPU clock between different tools. For instance, my stock MBA 7900XTX regularly goes over 3 GHz and even approaches 3200 MHz in some games, according to HWinfo's Effective Front End Clock.

But in these same games, tested with the same settings, max recorded clock rarely exceeds 2900 MHz when viewing CapFrameX logs :rolleyes:
Are you seeing the same front-end clock?
If you are seeing a "GPU clock", then it's the shader clock, not the front end.
What you see in the drivers is also the shader clock.

Is the shader clock in HWiNFO matching your other software's reading?
 
Joined
Jun 2, 2017
Messages
9,374 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Because that's the stock frequency limit on a 7900XTX.

AMD's frequency limit doesn't work the way people think it does; it behaves just like a power limit, momentarily allowing the GPU to exceed it and then paring back the clocks. The GPU always ramps to the highest possible frequency until it hits some imposed limit. This is not the same as Nvidia, which uses a fixed frequency/voltage curve, with the core falling somewhere along that curve depending on the limits. That's also why there is no frequency limit on those cards: the clocks are always bounded by the curve.

As far as I can tell, most programs, including AMD's own software, display the effective shader clock, which usually reports the lowest clocks of all, and I think that's also what the "max frequency limit" keeps track of. I think I posted a screenshot on this thread of my 7900XT hitting 3.2 GHz on the shader clock with an unstable undervolt; in practice, the only limit these cards have is the power limit.
I have noticed that console ports run at a consistent clock speed, but lower than some older games. I have seen high boost clocks in older, well-optimized games like Sleeping Dogs. The thing is, these cards are more powerful than we give them credit for. When I started playing AMS2, I noticed that the clock was around 1580 MHz, as I had Vsync on and the FPS was pinned at 144.
 
Joined
Dec 26, 2012
Messages
1,137 (0.26/day)
Location
Babylon 5
System Name DaBeast!!! DaBeast2!!!
Processor AMD AM4 Ryzen 7 5700X3D 8C/16T/AMD AM4 RYZEN 9 5900X 12C/24T
Motherboard Gigabyte X570 Aorus Xtreme/Gigabyte X570S Aorus Elite AX
Cooling Thermaltake Water 3.0 360 AIO/Thermalright PA 120 SE
Memory 2x 16GB Corsair Vengeance RGB RT DDR4 3600C16/2x 16GB Patriot Elite II DDR4 4000MHz
Video Card(s) XFX MERC 310 RX 7900 XTX 24GB/Sapphire Nitro+ RX 6900 XT 16GB
Storage 500GB Crucial P3 Plus NVMe PCIe 4x4 + 4TB Lexar NM790 NVMe PCIe 4x4 + TG Cardea Zero Z NVMe PCIe 4x4
Display(s) Samsung LC49HG90DMEX 32:9 144Hz Freesync 2/Acer XR341CK 75Hz 21:9 Freesync
Case CoolerMaster H500M/SOLDAM XR-1
Audio Device(s) iFi Micro iDSD BL + Philips Fidelio B97/FostexHP-A4 + LG SP8YA
Power Supply Corsair HX1000 Platinum/Enermax MAXREVO 1500
Mouse Logitech G303 Shroud Ed/Logitech G603 WL
Keyboard Logitech G915/Keychron K2
Software Win11 Pro/Win11 Pro
Oops, wrong thread, sorry 'bout that! :oops:
 
Last edited:
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Because that's the stock frequency limit on a 7900XTX.

AMD's frequency limit doesn't work the way people think it does; it behaves just like a power limit, momentarily allowing the GPU to exceed it and then paring back the clocks. The GPU always ramps to the highest possible frequency until it hits some imposed limit. This is not the same as Nvidia, which uses a fixed frequency/voltage curve, with the core falling somewhere along that curve depending on the limits. That's also why there is no frequency limit on those cards: the clocks are always bounded by the curve.

As far as I can tell, most programs, including AMD's own software, display the effective shader clock, which usually reports the lowest clocks of all, and I think that's also what the "max frequency limit" keeps track of. I think I posted a screenshot on this thread of my 7900XT hitting 3.2 GHz on the shader clock with an unstable undervolt; in practice, the only limit these cards have is the power limit.
This!

In short, Nvidia operates at the top of the voltage-frequency curve, unless the card hits a power or thermal limit. AMD, on the other hand, operates at a power limit, unless the card hits a thermal wall or it maxes out the allowed voltage-frequency limits/offsets.

That's why Nvidia cards operate at a relatively constant clock speed at various power consumption levels within the power and thermal headroom. AMD cards operate at maximum power consumption with more fluctuating clock speeds depending on the load.
 