
RX 5700 XT - 75 Hz/144 Hz/non-standard refresh rates causing VRAM clock to run at max

Kanan

Tech Enthusiast & Gamer
As another data point, my 6600xt has clocked down properly at all resolutions since I got it about 2 months ago, but that's only through 3 or so driver updates so far. And the idle power draw is ridiculously low at 4W. My gaming PC's total idle power usage is 24-25W from the wall after this change. Crazy low.
It makes sense since the 6600 XT is newer than the 6900 XT.
 
As another data point, my 6600xt has clocked down properly at all resolutions since I got it about 2 months ago, but that's only through 3 or so driver updates so far. And the idle power draw is ridiculously low at 4W. My gaming PC's total idle power usage is 24-25W from the wall after this change. Crazy low.
Yeah, RDNA2 is ridiculously efficient, and idle power for the smaller cards is fantastic. Really promising for the efficiency of future architectures as well - AMD catching up with and then surpassing Nvidia in GPU efficiency in just two generations is really impressive.
It makes sense since the 6600 XT is newer than the 6900 XT.
Shouldn't be a meaningful difference - those dice were taped out within a few months of each other, and it's highly unlikely that the newer die has any notable hardware changes. Most likely some driver tweak that applied to the 6900 XT didn't apply to the 6600 XT, which can be down to pretty much anything.
 

Kanan

Tech Enthusiast & Gamer
Most likely some driver tweak that applied to the 6900 XT didn't apply to the 6600 XT, which can be down to pretty much anything.
The other way around, otherwise I agree.

With RDNA 3, it's possible that AMD will take the performance crown again for the first time since the R9 290X.
 

Mussels

Freshwater Moderator
Encounter monitor bugs
Change drivers to run higher clocks as a short-term fix
Patch in fixes for various monitors/connection types over time
You are here
Release new products with hardware mitigation for the issue (seems like 6000 series cards have this, tbh)

Nvidia went through this too; high-refresh-rate, high-res monitors change things. I bet even Nvidia would have higher idle clocks at 4K120
 

Kanan

Tech Enthusiast & Gamer
Nvidia went through this too; high-refresh-rate, high-res monitors change things. I bet even Nvidia would have higher idle clocks at 4K120
My former 1080 Ti didn't even care when I had 1440p/144 Hz + 1080p/120 Hz connected. It only increased clocks when I streamed something, and even then not to full clocks.
Release new products with hardware mitigation for the issue (seems like 6000 series cards have this, tbh)
Very possible, and at least it had more mature drivers from the get-go compared to older RDNA 2 cards.
 

Mussels

Freshwater Moderator
My former 1080 Ti didn't even care when I had 1440p/144 Hz + 1080p/120 Hz connected. It only increased clocks when I streamed something, and even then not to full clocks.

Very possible, and at least it had more mature drivers from the get-go compared to older RDNA 2 cards.
Nvidia had issues with high-refresh HDMI, and upped their DisplayPort versions to fix it

My brother has a 165 Hz G-Sync monitor, and on his GTX 1080, 165 Hz caused it to clock up; 144 Hz did not
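
For a rough sense of why 165 Hz can cross a limit that 144 Hz doesn't, here's a minimal pixel-clock sketch; the blanking totals approximate CVT-RB, and the driver cutoff is purely hypothetical, since neither vendor documents it:

```python
# Rough pixel-clock math for 2560x1440 at common refresh rates.
# Totals approximate CVT-RB timing; real values come from the monitor's EDID.
HTOTAL, VTOTAL = 2720, 1525  # 2560x1440 active plus reduced blanking

for hz in (120, 144, 165):
    mhz = HTOTAL * VTOTAL * hz / 1e6
    print(f"{hz:3d} Hz -> pixel clock ~{mhz:.0f} MHz")
# 120 Hz -> ~498 MHz, 144 Hz -> ~597 MHz, 165 Hz -> ~684 MHz.
# If the driver keeps VRAM boosted above some internal pixel-clock cutoff
# (hypothetical here), 144 Hz can sit under it while 165 Hz lands over it.
```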
 

Kanan

Tech Enthusiast & Gamer
165 Hz is the most unnecessary "Hz increase" ever anyway, comparable to 120 -> 144. Useless, no tangible benefits. More worries about hitting FPS targets.
 

Mussels

Freshwater Moderator
165 Hz is the most unnecessary "Hz increase" ever anyway, comparable to 120 -> 144. Useless, no tangible benefits. More worries about hitting FPS targets.
You dare insult my glorious refresh rate?
(Nah, I totally agree.)
I only went 165 Hz because I figured that even running at 144 or 120, the higher-refresh models would likely have newer tech and not be a rehashed five-year-old design
 

Kanan

Tech Enthusiast & Gamer
You dare insult my glorious refresh rate?
(Nah, I totally agree.)
I only went 165 Hz because I figured that even running at 144 or 120, the higher-refresh models would likely have newer tech and not be a rehashed five-year-old design
Plus, it really is not a problem with a 3090. :laugh: I will probably skip the GPUs this year; my 2080 Ti is clocked so high it's nearly as fast as a 3080, so it doesn't feel "dated" at all.
 

Cheeseball

Not a Potato
Apologies for bumping this thread, but I would like to add this since I am using an RX 6900 XT (for games) and an RTX 3090 (for work/ML). I have taken the RTX 3090 out since I'm on vacation and solely play Apex Legends (which runs "better" on the RX 6900 XT since it's Source-based).

What I found is that one side of my monitor (Samsung Odyssey G9 Neo, running in PBP at 1440p/120 for both sides) was running at 10 bpc color depth (DisplayPort) while the other was running at 8 bpc, which is its max (HDMI 2.0 in PBP mode / 2.1 solo). The AMD driver does not downclock the VRAM at idle if the color depths are different, so setting the DP side of the screen to 8 bpc to match the other side allows it to idle properly.

[screenshot attachments]


This issue does not occur on my RTX 3090, as apparently the driver supports having different color depths for each monitor. I would run both on DisplayPort, but the G9 Neo only has 1 DP port and 2 HDMI 2.1 ports, so I have to use a DP 1.4a to HDMI 2.1 converter (the sole HDMI connector on the video card is connected to an HDTV).
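
For anyone who wants to verify the idle behavior themselves: on Linux the amdgpu driver exposes the memory clock state table in sysfs, so a few lines of Python are enough to watch it (the card0 path is an assumption - check your card's index; on Windows, GPU-Z or the Radeon overlay serves the same purpose):

```python
# Check whether the VRAM clock drops at idle on Linux with the amdgpu driver.
# pp_dpm_mclk lists the memory clock states; the active one ends with '*'.
from pathlib import Path

# card0 is an assumption - check /sys/class/drm/ for your card's index.
mclk_table = Path("/sys/class/drm/card0/device/pp_dpm_mclk")

for line in mclk_table.read_text().splitlines():
    flag = "  <-- current" if line.rstrip().endswith("*") else ""
    print(line + flag)
# A properly idling card shows the lowest state selected (e.g. "0: 96Mhz *");
# with mismatched color depths it stays pinned to the highest state.
```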
 
I am running my 5700 XT with a PG279Q at 144 Hz with no problem; the VRAM stays at 100 MHz when idle.
Didn't try with newer drivers... I am still on 19.9.1.
I thought the Asus PG279Q only supports G-Sync? That's all mine supports. Did they come out with a FreeSync version later, or maybe a variant that supports both FreeSync and G-Sync?
 

Mussels

Freshwater Moderator
Apologies for bumping this thread, but I would like to add this since I am using an RX 6900 XT (for games) and an RTX 3090 (for work/ML). I have taken the RTX 3090 out since I'm on vacation and solely play Apex Legends (which runs "better" on the RX 6900 XT since it's Source-based).

What I found is that one side of my monitor (Samsung Odyssey G9 Neo, running in PBP at 1440p/120 for both sides) was running at 10 bpc color depth (DisplayPort) while the other was running at 8 bpc, which is its max (HDMI 2.0 in PBP mode / 2.1 solo). The AMD driver does not downclock the VRAM at idle if the color depths are different, so setting the DP side of the screen to 8 bpc to match the other side allows it to idle properly.

[screenshot attachments]

This issue does not occur on my RTX 3090, as apparently the driver supports having different color depths for each monitor. I would run both on DisplayPort, but the G9 Neo only has 1 DP port and 2 HDMI 2.1 ports, so I have to use a DP 1.4a to HDMI 2.1 converter (the sole HDMI connector on the video card is connected to an HDTV).
10-bit uses more bandwidth, so it could be as simple as dropping the bandwidth to get back under the threshold that triggers higher clocks

What happens if you try 6/6 and 6/8? (Apart from looking terrible.)
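
A back-of-the-envelope sketch of that bandwidth hunch, assuming approximate CVT-RB timings and HDMI 2.0's roughly 14.4 Gbps of usable payload:

```python
# Does 2560x1440 @ 120 Hz fit through HDMI 2.0 at a given color depth?
HTOTAL, VTOTAL, HZ = 2720, 1525, 120   # approximate CVT-RB timing
HDMI20_GBPS = 14.4                     # usable payload after TMDS encoding

pixel_clock = HTOTAL * VTOTAL * HZ     # ~497.8 MHz
for bpc in (6, 8, 10, 12):
    gbps = pixel_clock * bpc * 3 / 1e9  # three color channels
    verdict = "fits" if gbps <= HDMI20_GBPS else "too much"
    print(f"{bpc:2d} bpc: {gbps:5.1f} Gbps -> {verdict}")
# 8 bpc (~11.9 Gbps) fits under HDMI 2.0's ~14.4 Gbps; 10 bpc (~14.9 Gbps)
# doesn't - consistent with the bpc options reported later in the thread.
```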
 

Cheeseball

Not a Potato
10-bit uses more bandwidth, so it could be as simple as dropping the bandwidth to get back under the threshold that triggers higher clocks

What happens if you try 6/6 and 6/8? (Apart from looking terrible.)

I figured it was a bandwidth limitation, because if I set both monitors to 60 Hz it exposes the 10-bit option for both of them, but if I set the HDMI-connected display to 120 Hz I only have the option for 8 bpc. The DisplayPort-connected one has the choices of 6, 8, and 10 bpc, since DP normally has the bandwidth for it.

Interestingly, at 60 Hz the HDMI-connected display offers 8, 10, and 12 bpc, but no 6 bpc:

[screenshot attachment]


At 60 Hz the DisplayPort one offers 6, 8, and 10 bpc:

[screenshot attachment]


A mix of 6 bpc and 8 bpc also causes the VRAM clocks to shoot up:

[screenshot attachment]
 

Mussels

Freshwater Moderator
Part of what I read and recall on this was that at the GPU level you have clock generators for the displays.
This is from fuzzy memory, so I'm sure I'll get at least the terminology wrong.

VGA, DVI, and HDMI below 1.2 had a clock rate that varied depending on the data being sent - DP had a fixed rate, not a varying one.
Something about DP being packet-based vs. the others being more like VGA, where everything revolves around horizontal and vertical scan-rate timings, etc.

A lot of video cards shared the clock gens, so you could use 3x native DP + HDMI + DVI, but if you used passive DP to anything else, you'd find yourself limited to just 3 total displays.

There's something I've forgotten here, but the final part is that using all the bandwidth of DP (natively or via adapters) used more of the card's own hardware (specifically VRAM) instead of those ready-made clock gens, and required the clocks to be higher at idle to avoid issues.

Comments like this from Toasty (author of CRU) tie into the probably-important bits I've forgotten: 'VRAM has to update faster than the monitor does, you'll get flickering or blackouts on the screen'
[screenshot attachment]


This guy found the limit of how many older displays a 5700 XT can support (two; he tried three):
DP to DVI adapter not working on Radeon RX 5700 XT - AMD Community



The things I'm not sure of are:
Is this fix based on assumptions for the older display types, or is it necessary on DP too?
Does the fixed data rate of DP and modern HDMI (instead of a variable one) just assume the max possible bandwidth is needed and clock the VRAM up just in case?
I know Nvidia ditched VGA support ASAP - is it cards with support for older standards that have to do this?
Is this something that a change in VRAM tech solves? Is it simply that GDDR6X cards can handle X bandwidth while GDDR6 or GDDR5 can't? (Testing a 1070 Ti vs. a 1080 would cover this, as the key difference there is their VRAM.)

This could be a drawback of maintaining legacy support for older monitor tech, or it could be a software thing - meant to avoid issues with certain combinations of displays and adapters - being triggered when it's not needed.
 
I don't think the packet-based thing ends up making much difference in the end. If you think about it, in theory you could perhaps burst a whole frame into a buffer in the monitor and then have time to do whatever you like for a while. But in order to send it down in, say, half the time, you would need double the bandwidth. Manufacturers are going to provision enough bandwidth and no more (or perhaps even slightly less than enough, leading to this thread). There are a lot of other problems too, like getting everyone to play ball with your scheme.

So at the end of the day you have just enough time to send everything along, packet-based or not.

Is this fix based on assumptions for the older display types, or is it neccesary on DP too?
It's necessary any time there is insufficient time to change the DDR speed during the blanking interval between frames. It doesn't really matter what link you are using.

Is this something that changed VRAM tech solves? Is it simply that GDDR6x cards can handle X bandwidth while GDDR6 or GDDR5 cant? (testing a 1070ti vs a 1080 would cover this, as the key difference there is their VRAM)
It's not about the bandwidth; it's about having enough real time to change clock speeds. If a new RAM tech solves it, it will be because they found a way to change speeds ultra-fast.

I'm not sure what AMD/Nvidia are doing in their drivers/products to deal with this; it could be any number of tricks, and they never talk about this issue. As it stands, the problem will continue to get more prevalent and harder to solve as monitor refresh rates increase. The higher the refresh, the less time between frames to change clock speeds. Once you go high enough, it doesn't matter how much blanking you add in with CRU - there will just never be enough time.
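
To put rough numbers on 'less time between frames', here is a sketch of the vertical blanking window at various refresh rates; the timing figures are approximate, and the reclock time is a made-up placeholder since vendors don't publish it:

```python
# How much vertical blanking time is there to retrain VRAM between frames?
V_ACTIVE, V_BLANK = 1440, 85   # approximate 1440p timing; varies per EDID
RECLOCK_US = 350               # hypothetical GDDR retrain time (not published)

for hz in (60, 120, 144, 165, 240):
    frame_us = 1e6 / hz
    vblank_us = frame_us * V_BLANK / (V_ACTIVE + V_BLANK)
    verdict = "enough" if vblank_us >= RECLOCK_US else "too short"
    print(f"{hz:3d} Hz: vblank ~{vblank_us:3.0f} us -> {verdict}")
# With these made-up numbers 144 Hz squeaks by and 165 Hz doesn't, matching
# the GTX 1080 anecdote earlier in the thread. Adding blanking lines in CRU
# widens the window, but only until the link runs out of bandwidth.
```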

The permanent solution would be some sort of hardware buffer, like a pair of 80 MB RAM chips on the graphics card that are purely there to hold the output frame, so the display output can read from them while the main GDDR does whatever it likes.
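
The suggested buffer size roughly checks out; a quick sketch of the arithmetic for an uncompressed 4K 10-bit frame:

```python
# Sizing the dedicated scanout buffer suggested above (uncompressed frame).
w, h, bpc, channels = 3840, 2160, 10, 3
frame_mb = w * h * bpc * channels / 8 / 1e6
print(f"one 4K 10-bit frame: ~{frame_mb:.0f} MB")      # ~31 MB
print(f"double-buffered:     ~{2 * frame_mb:.0f} MB")  # ~62 MB
# A pair of 80 MB chips comfortably holds a double-buffered 4K 10-bit frame,
# so scanout never needs to touch the main GDDR mid-reclock.
```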
 
The permanent solution would be some sort of hardware buffer, like a pair of 80 MB RAM chips on the graphics card that are purely there to hold the output frame, so the display output can read from them while the main GDDR does whatever it likes.
Would dual-ported WRAM help with this issue?
 
I suppose it could help if you used it for the buffer chip. It might let you use half as much if it saves you from needing to double-buffer.
 