
AMD Radeon RX 7800 XT

Joined
Jun 2, 2017
Messages
9,729 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Another one not even trying to read. I put zero opinion in any of that.

In Cyberpunk? On a 7900 XT? It's impossible on ANY GPU. Even a 5090 can't do that; it's gonna be hard-stuck at double digits. If you have Hyper RX enabled, then it's not native - you have heavy FSR upscaling involved. Hello, fellow 720p gamer!

Literally any title where ray tracing is the only available option clearly shows that AMD GPUs trail behind Team Green price-wise. Especially at the highest end, where VRAM capacity is redundant on whatever model. Avatar, for example, and SH2 IIRC. It's not strictly "absolutely sucks", but still bad.

Do you realise you're comparing a GPU that has a competitor to a GPU that hasn't? Compare it to a 4070 Ti, or maybe a Super if we get a 7900 XTX that overclocks well enough - that's where the 7900 XTX lies if we take RT seriously. Or a 4080 if we don't. The 4090 is in a league of its own; it's allowed ANY price that's lower than whatever NV has in stock for even more advanced customers, like those RTX 6000 GPUs.

Placebo effect. Colours are identical if we're talking identical displays and identical, properly working cables. If you mean the "Vivid Gaming" preset from the Adrenalin app, it's arguably useful. I never found those oversaturated colours appealing. Just calibrate your monitor properly and run games at default colours, unless you're colour-blind.
We could go on, but do you realize that what you are saying is that my experience has not been my experience? That was not with Hyper RX. Is gaming not about math anymore? Nope, no FSR either, but it does not matter. It is my experience, and if you don't want to believe it, that is your prerogative.

Well, I am not going to change my position on the China effect on 4090s. Do you remember when Nvidia shifted supply to their Eastern vendors, then made the 4090D to keep the sales going?

If you go back and read the review of the 7900 XTX, you will realize that the 4070 Ti argument is just the narrative talking. Yes, the 6000-series cards are good, but my 6800 XT can't hold a candle to my 7900 XT. Once again, it is math, but you will go on.

Colours are not identical. Vivid Gaming? As if you don't have total control of your monitor using AMD's software. Yeah, it's that flower icon in the top right corner of the AMD software. Calibrate your monitor? Is this 2012? Mini-LED and OLED panels are not calibrated. But if you must, AMD software can adjust contrast, saturation, red-green ratio, brightness, colour temperature, and another one I can't remember right now, as I have been drinking. There is no placebo effect if all of the PCs are connected to the same display. Even my TV shows the same thing. I know that, according to the narrative, AMD can't be better than Nvidia at anything, so it would be hard for you to believe.
 
Joined
Jun 14, 2020
Messages
4,297 (2.52/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Yeah, I definitely noticed AMD has better colors. A 4090 can't display colors properly... :roll: :roll:
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Another one not even trying to read. I put zero opinion in any of that.
That's what you believe. The fact that I disagree with it proves that it is opinion, nothing else.

Edit: This is the problem with the "Nvidia is king" (virus) mindset. People believe it like it was fact when it actually isn't.

In Cyberpunk? On a 7900 XT? It's impossible on ANY GPU. Even a 5090 can't do that; it's gonna be hard-stuck at double digits. If you have Hyper RX enabled, then it's not native - you have heavy FSR upscaling involved. Hello, fellow 720p gamer!
What are you on about? I can play Cyberpunk on a 6750 XT at 1440 UW native just fine. All I have to do is turn RT off. 4K should be a breeze for a 7900 XT(X).

Literally any title where ray tracing is the only available option clearly shows that AMD GPUs trail behind Team Green price-wise. Especially at the highest end, where VRAM capacity is redundant on whatever model. Avatar, for example, and SH2 IIRC. It's not strictly "absolutely sucks", but still bad.
That I agree with.

Do you realise you're comparing a GPU that has a competitor to a GPU that hasn't? Compare it to a 4070 Ti, or maybe a Super if we get a 7900 XTX that overclocks well enough - that's where the 7900 XTX lies if we take RT seriously. Or a 4080 if we don't. The 4090 is in a league of its own; it's allowed ANY price that's lower than whatever NV has in stock for even more advanced customers, like those RTX 6000 GPUs.
So just because the 4090 is ~30% faster than the 7900 XTX, Nvidia can ask for any price they want? Even 100% more?

Placebo effect. Colours are identical if we're talking identical displays and identical, properly working cables. If you mean the "Vivid Gaming" preset from the Adrenalin app, it's arguably useful. I never found those oversaturated colours appealing. Just calibrate your monitor properly and run games at default colours, unless you're colour-blind.
I've had issues with Nvidia cards not displaying proper 8-bit colours with full dynamic range on my 4K 60 Hz TV. It works with 4K 30 Hz, but as soon as I set it to 60 Hz, I only get limited dynamic range. I've tried a 1050 Ti, a 1660 Ti and a 2070, all of which are supposed to support 4K 60 Hz, but none of them worked with full dynamic range. I didn't have this issue with my Intel Xe iGPU or any RDNA 2 card I have on hand.
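One plausible contributor (my speculation, not confirmed in the thread) is that those cards were driving the TV over an HDMI 1.4 link, which cannot carry 4K 60 Hz RGB at all, forcing the driver into reduced formats. The bandwidth side is easy to sanity-check, assuming the standard CTA-861 4K timings and the HDMI 1.4 TMDS clock limit:

```python
# Sanity check (my numbers, not from the post): why 4K 60 Hz RGB doesn't
# fit in HDMI 1.4, while 4K 30 Hz does.

# CTA-861 total frame size for 3840x2160, including blanking
TOTAL_H, TOTAL_V = 4400, 2250

HDMI14_MAX_TMDS_MHZ = 340  # HDMI 1.4 maximum TMDS (pixel) clock

def pixel_clock_mhz(refresh_hz):
    """Pixel clock required for 4K at the given refresh rate."""
    return TOTAL_H * TOTAL_V * refresh_hz / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(hz)
    verdict = "fits" if clk <= HDMI14_MAX_TMDS_MHZ else "exceeds"
    print(f"4K {hz} Hz: {clk:.0f} MHz pixel clock -> {verdict} HDMI 1.4")
```

4K 30 Hz needs 297 MHz (within the 340 MHz limit), while 4K 60 Hz needs 594 MHz, which HDMI 1.4 simply cannot carry as full RGB.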
 
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
Location
Czech republic
Processor Ryzen 5800X
Motherboard Asus TUF-Gaming B550-Plus
Cooling Noctua NH-U14S
Memory 32GB G.Skill Trident Z Neo F4-3600C16D-32GTZNC
Video Card(s) Sapphire AMD Radeon RX 7900 XTX Nitro+
Storage HP EX950 512GB + Samsung 970 PRO 1TB
Display(s) Cooler Master GP27Q
Case Fractal Design Define R6 Black
Audio Device(s) Creative Sound Blaster AE-5
Power Supply Seasonic PRIME Ultra 650W Gold
Mouse Roccat Kone AIMO Remastered
Software Windows 10 x64
What is this shit I don't even... Stop the colour wars lol.

I have just snatched a 6-month-old Sapphire 7900 XTX from a streamer friend's friend for the equivalent of €634. That's good, I presume.
 
Joined
Feb 22, 2009
Messages
780 (0.13/day)
Processor Ryzen 7 5700X3D
Motherboard Asrock B550 PG Velocita
Cooling Thermalright Silver Arrow 130
Memory G.Skill 4000 MHz DDR4 32 GB
Video Card(s) XFX Radeon RX 7800XT 16 GB
Storage Plextor PX-512M9PEGN 512 GB
Display(s) 1920x1200; 100 Hz
Case Fractal Design North XL
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
Someone please explain the logic behind overclocking/undervolting this video card. I've seen many reviewers use a range from 500 MHz to 3000 MHz for the RX 7800XT as a standard.

1. The card will not go above 3000 MHz even with well-tuned undervolting - so why use 5000 MHz just because the graph allows it to reach that number?
2. Why use 2800 MHz as the minimum frequency when that negates your power savings? Doesn't 2800 MHz mean the card will always be working at 100% with no power saving?

The 2800 MHz low mark is particularly bewildering to me... It was not explained. Can someone explain the reviewed and observed differences in the behavior of RX 7800XT clock speeds when using TPU's 2800/5000 MHz settings versus the common 500/3000 MHz settings? I have not done any proper undervolting since 2017, when I had my RX Vega 56.

I now struggle with my RX 7800XT - I cannot get it stable below 1025 mV, though from what I see this card should run fine at 970 mV. SERIOUSLY WTF? Is my sample that bad?
 
Joined
Apr 30, 2011
Messages
2,732 (0.54/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Someone please explain the logic behind overclocking/undervolting this video card. I've seen many reviewers use a range from 500 MHz to 3000 MHz for the RX 7800XT as a standard.

1. The card will not go above 3000 MHz even with well-tuned undervolting - so why use 5000 MHz just because the graph allows it to reach that number?
2. Why use 2800 MHz as the minimum frequency when that negates your power savings? Doesn't 2800 MHz mean the card will always be working at 100% with no power saving?

The 2800 MHz low mark is particularly bewildering to me... It was not explained. Can someone explain the reviewed and observed differences in the behavior of RX 7800XT clock speeds when using TPU's 2800/5000 MHz settings versus the common 500/3000 MHz settings? I have not done any proper undervolting since 2017, when I had my RX Vega 56.

I now struggle with my RX 7800XT - I cannot get it stable below 1025 mV, though from what I see this card should run fine at 970 mV. SERIOUSLY WTF? Is my sample that bad?
Check the GPU-Z voltage to be sure of the real-time numbers. What the driver limit is set to might be very different from that. RDNA2 behaves that way.
 
Joined
Jun 2, 2017
Messages
9,729 (3.46/day)
Someone please explain the logic behind overclocking/undervolting this video card. I've seen many reviewers use a range from 500 MHz to 3000 MHz for the RX 7800XT as a standard.

1. The card will not go above 3000 MHz even with well-tuned undervolting - so why use 5000 MHz just because the graph allows it to reach that number?
2. Why use 2800 MHz as the minimum frequency when that negates your power savings? Doesn't 2800 MHz mean the card will always be working at 100% with no power saving?

The 2800 MHz low mark is particularly bewildering to me... It was not explained. Can someone explain the reviewed and observed differences in the behavior of RX 7800XT clock speeds when using TPU's 2800/5000 MHz settings versus the common 500/3000 MHz settings? I have not done any proper undervolting since 2017, when I had my RX Vega 56.

I now struggle with my RX 7800XT - I cannot get it stable below 1025 mV, though from what I see this card should run fine at 970 mV. SERIOUSLY WTF? Is my sample that bad?
Check the thermal pads and paste. If you have had this card from day one a heavy OC can cause some of those to fail.
 
Joined
Feb 22, 2009
Messages
780 (0.13/day)
Check the GPU-Z voltage to be sure of the real-time numbers. What the driver limit is set to might be very different from that. RDNA2 behaves that way.

I ran Furmark. Both GPU-Z and HWMonitor reported an average of 0.96 V being delivered during the run, while AMD Adrenalin was set to 1.0 V. Furmark did not crash.

IMPORTANT!

Does this mean that Wizzard's fixed-minimum-voltage comparison table below of different RX 7800XT & RX 7700XT cards was based on GPU-Z readings rather than the numbers simply entered in Adrenalin, MSI Afterburner, RivaTuner, etc.?
 

Attachments

  • RX 7800XT OC.jpg (147.4 KB)
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
Someone please explain the logic behind overclocking/undervolting this video card. I've seen many reviewers use a range from 500 MHz to 3000 MHz for the RX 7800XT as a standard.

1. The card will not go above 3000 MHz even with well-tuned undervolting - so why use 5000 MHz just because the graph allows it to reach that number?
2. Why use 2800 MHz as the minimum frequency when that negates your power savings? Doesn't 2800 MHz mean the card will always be working at 100% with no power saving?

The 2800 MHz low mark is particularly bewildering to me... It was not explained. Can someone explain the reviewed and observed differences in the behavior of RX 7800XT clock speeds when using TPU's 2800/5000 MHz settings versus the common 500/3000 MHz settings? I have not done any proper undervolting since 2017, when I had my RX Vega 56.

I now struggle with my RX 7800XT - I cannot get it stable below 1025 mV, though from what I see this card should run fine at 970 mV. SERIOUSLY WTF? Is my sample that bad?
Setting the minimum frequency to, say, 2800 MHz feels completely stupid to me, so I am puzzled by some of this as well.
The Adrenalin settings are confusing, though, because from what I've read the sliders aren't "real" settings but part of what defines some sort of curve you cannot adjust. No, I don't get it, so I don't mess with it, and I only ever touch any of this when I want to downclock a card to lower power consumption by setting a lower maximum frequency.
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
Someone please explain the logic behind overclocking/undervolting this video card. I've seen many reviewers use a range from 500 MHz to 3000 MHz for the RX 7800XT as a standard.

1. The card will not go above 3000 MHz even with well-tuned undervolting - so why use 5000 MHz just because the graph allows it to reach that number?
2. Why use 2800 MHz as the minimum frequency when that negates your power savings? Doesn't 2800 MHz mean the card will always be working at 100% with no power saving?

The 2800 MHz low mark is particularly bewildering to me... It was not explained. Can someone explain the reviewed and observed differences in the behavior of RX 7800XT clock speeds when using TPU's 2800/5000 MHz settings versus the common 500/3000 MHz settings? I have not done any proper undervolting since 2017, when I had my RX Vega 56.

I now struggle with my RX 7800XT - I cannot get it stable below 1025 mV, though from what I see this card should run fine at 970 mV. SERIOUSLY WTF? Is my sample that bad?
Maximum doesn't mean what your card will reach, and minimum doesn't mean a hard limit. Your true value will be somewhere between the two. It's weird as heck, but it's only been getting weirder since GPU power saving was invented. Also, your game clock will depend on the game, as the card is stressed differently in different scenarios. Your card works against a power limit, not a clock limit. So the key is maxing out the power limit while finding the highest clock that works at the lowest voltage. And people wonder why I don't bother.
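A toy model of that interplay (my own simplification, not AMD's actual boost algorithm): the card chases whatever clock the power budget allows and the sliders merely clamp it into a window, which is why a 5000 MHz ceiling is harmless and a 2800 MHz floor mostly just raises idle-under-load clocks.

```python
# Toy model (illustrative numbers, not AMD's real algorithm) of how the
# min/max frequency sliders interact with the power limit.

def effective_clock(power_limited_mhz, min_mhz, max_mhz):
    """The card targets the clock the power budget allows,
    clamped into the user's [min, max] window."""
    return max(min_mhz, min(power_limited_mhz, max_mhz))

# Suppose the power budget caps a given load around 2700 MHz:
print(effective_clock(2700, 2800, 5000))  # the 2800 floor pulls it up to 2800
print(effective_clock(2700, 500, 3000))   # the power limit decides: 2700
```

Under this model the 5000 MHz maximum is never reached; the power limit settles the clock first, which matches the "power limit, not clock limit" point above.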

Oh, and I do not recommend Furmark. It's a very one-sided test; it doesn't stress your whole card. I'd recommend a new game on full detail, Unigine Superposition, or 3DMark Time Spy instead.

Setting the minimum frequency to, say, 2800 MHz feels completely stupid to me, so I am puzzled by some of this as well.
The Adrenalin settings are confusing, though, because from what I've read the sliders aren't "real" settings but part of what defines some sort of curve you cannot adjust. No, I don't get it, so I don't mess with it, and I only ever touch any of this when I want to downclock a card to lower power consumption by setting a lower maximum frequency.
Exactly.
 
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
I read something about Furmark a couple days ago. Supposedly it's a good cooling quality test but not a good stress test. Somehow it produces surreal amounts of heat while not stressing the card fully so that instability might not be noticed.
It's also running on OpenGL or something, which nothing uses these days.
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
I read something about Furmark a couple days ago. Supposedly it's a good cooling quality test but not a good stress test. Somehow it produces surreal amounts of heat while not stressing the card fully so that instability might not be noticed.
It's also running on OpenGL or something, which nothing uses these days.
It's one single graphical effect - a rolling furball. That anyone can think it's a thorough test of all your card's capabilities is a bit daft.
 
Joined
Oct 27, 2020
Messages
26 (0.02/day)
I read something about Furmark a couple days ago. Supposedly it's a good cooling quality test but not a good stress test. Somehow it produces surreal amounts of heat while not stressing the card fully so that instability might not be noticed.
It's also running on OpenGL or something, which nothing uses these days.

Furmark would stress modern GPUs fully if the power limit allowed it to.
The reason we even have GPU Boost, rather than a static clock, is Furmark.

I'm pretty sure that the 4090's 1000W XOC BIOS isn't enough for Furmark.

It's one single graphical effect - a rolling furball. That anyone can think it's a thorough test of all your card's capabilities is a bit daft.

It's purely shaders, meaning you're not bottlenecked by the TMUs, ROPs, or memory bandwidth. That's why it's producing ridiculous heat even today.
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
It's purely shaders, meaning you're not bottlenecked by the TMUs, ROPs, or memory bandwidth. That's why it's producing ridiculous heat even today.
And that's why it's not indicative of how stable your card is.
 
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
Furmark would stress modern GPUs fully if the power limit allowed it to.
The reason we even have GPU Boost, rather than a static clock, is Furmark.

I'm pretty sure that the 4090's 1000W XOC BIOS isn't enough for Furmark.



It's purely shaders, meaning you're not bottlenecked by the TMUs, ROPs, or memory bandwidth. That's why it's producing ridiculous heat even today.
How do you mean?
I don't understand any of the tech stuff behind all this.
Can you elaborate?
 
Joined
Oct 27, 2020
Messages
26 (0.02/day)
And that's why it's not indicative of how stable your card is.

Because it doesn't load the TMUs, ROPs, or memory bandwidth much? That's just silly.
If a game is fillrate-limited by the TMUs or ROPs, then the game targets GPU architectures from 20+ years ago like Curie or R400.
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
Because it doesn't load the TMUs, ROPs, or memory bandwidth much? That's just silly.
If a game is fillrate-limited by the TMUs or ROPs, then the game targets GPU architectures from 20+ years ago like Curie or R400.
It's a stable and consistent load. A scene in a game varies frame by frame, causing tiny inconsistencies. That's where your card's stability comes through.
 
Joined
Oct 27, 2020
Messages
26 (0.02/day)
How do you mean?
I don't understand any of the tech stuff behind all this.
Can you elaborate?

As GPUs have grown in size and capabilities, they have become increasingly hard to feed properly. The RX 7900 XTX, for example, has 48x the number of CUs clocked at 2x the frequency of a Ryzen iGPU, yet it only performs 45x as fast. What gives?

In most games, those 6144 shader units are doing nothing for huge chunks of time; the same goes for the TMUs, ROPs, L2, and memory bus. This is because there are dependencies in the graphics calculations.
With deferred rendering, for example, the lighting and shadow calculations can't begin before the G-buffer is complete.
In addition, there's no guarantee that all shaders will be large enough to saturate a warp. Any operation a GPU starts engages at minimum a certain number of shader units (the warp size); if it engages fewer than that, some of the shader units simply go idle.

Furmark does away with all of that and just uses a simple pixel shader for everything. This eliminates dependencies and register file bottlenecks, and lets all the shader units do something "useful" for the entire render period. That also means the entire GPU is 100% utilized and producing heat at the same time, which causes ridiculous power usage. You're looking at well over 1000 W for a 3090.
Even back in the GTX 580 days (when the underutilization issue wasn't as pronounced), this caused ridiculous power spikes and actually fried VRMs.

The solution both Nvidia and AMD employed was to detect power virus loads like Furmark with dedicated circuitry, and forcibly throttle the GPUs to prevent meltdowns. This circuitry evolved and allowed GPU Boost to be introduced.
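To put rough numbers on the scaling gap and the warp-granularity point above (illustrative figures only; the 48x/2x/45x ratios are from the post, the warp size of 32 is a common example value, not a claim about any specific GPU):

```python
import math

# Scaling gap from the post: 48x the CUs at 2x the clock should give ~96x
# the throughput, but the observed speedup is only ~45x.
theoretical = 48 * 2
observed = 45
print(f"effective scaling: {observed / theoretical:.0%}")  # 47%

# Warp granularity: work is issued in fixed-size groups of lanes,
# so a dispatch of 40 threads still occupies two full 32-lane warps.
WARP = 32

def warps_needed(threads):
    """Number of warps a dispatch of `threads` threads occupies."""
    return math.ceil(threads / WARP)

busy = 40 / (warps_needed(40) * WARP)
print(f"{warps_needed(40)} warps for 40 threads, {busy:.1%} of issued lanes busy")
```

In other words, less than half of the theoretical throughput survives the dependency stalls and partially filled warps, which is exactly the headroom Furmark's trivially parallel shader is able to reclaim.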

It's a stable and consistent load. A scene in a game varies frame by frame, causing tiny inconsistencies. That's where your card's stability comes through.
If you had the power supply, cooling, and VRM to run Furmark on a 4090 at full boost clocks, I can guarantee you that Furmark would expose the issues.

The problem is that you'd need a 2500 W power supply, liquid nitrogen, and a VRM with a ridiculous number of power stages to handle it.
 
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
I find it difficult to believe either company would care about one obscure stress program so much they would majorly change the design of their GPUs. Not saying it's not true, just weird, even though your post makes perfect sense.
 
Joined
Oct 27, 2020
Messages
26 (0.02/day)
I find it difficult to believe either company would care about one obscure stress program so much they would majorly change the design of their GPUs. Not saying it's not true, just weird, even though your post makes perfect sense.

Attached to GTX 580 are a series of power monitoring chips, which monitor the amount of power the card is drawing from the PCIe slot and PCIe power plugs. By collecting this information NVIDIA’s drivers can determine if the card is drawing too much power, and slow the card down to keep it within spec. This kind of power throttling is new for GPUs, though it’s been common with CPUs for a long time.

NVIDIA’s reasoning for this change doesn’t pull any punches: it’s to combat OCCT and FurMark. At an end-user level FurMark and OCCT really can be dangerous – even if they can’t break the card any longer, they can still cause other side-effects by drawing too much power from the PSU. As a result having this protection in place more or less makes it impossible to toast a video card or any other parts of a computer with these programs. Meanwhile at a PR level, we believe that NVIDIA is tired of seeing hardware review sites publish numbers showcasing GeForce products drawing exorbitant amounts of power even though these numbers represent non-real-world scenarios. By throttling FurMark and OCCT like this, we shouldn’t be able to get their cards to pull so much power. We still believe that tools like FurMark and OCCT are excellent load-testing tools for finding a worst-case scenario and helping our readers plan system builds with those scenarios in mind, but at the end of the day we can’t argue that this isn’t a logical position for NVIDIA.
 
Joined
Aug 13, 2009
Messages
3,395 (0.60/day)
So does this mean all cards are artificially performance-limited? Because that's the only secondary conclusion I can draw from this. What the hell is "pulling too much power from a PSU"? Can't they implement a power limit in some way other than checking for two specific programs?
 
Joined
Oct 27, 2020
Messages
26 (0.02/day)
So does this mean all cards are artificially performance-limited? Because that's the only secondary conclusion I can draw from this. What the hell is "pulling too much power from a PSU"? Can't they implement a power limit in some way other than checking for two specific programs?

They aren't artificially performance-limited at all; why would they be?
The hardware monitoring evolved to allow for GPU Boost with Kepler: https://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/4
The Fermi approach of having drivers check for software names was deprecated ages ago.
 
Joined
Jan 14, 2019
Messages
14,246 (6.42/day)
I find it difficult to believe either company would care about one obscure stress program so much they would majorly change the design of their GPUs. Not saying it's not true, just weird, even though your post makes perfect sense.
They care if that one obscure stress program leads to RMA requests and/or lawsuits.
 
Joined
Apr 30, 2011
Messages
2,732 (0.54/day)
I ran Furmark. Both GPU-Z and HWMonitor reported an average of 0.96 V being delivered during the run, while AMD Adrenalin was set to 1.0 V. Furmark did not crash.

IMPORTANT!

Does this mean that Wizzard's fixed-minimum-voltage comparison table below of different RX 7800XT & RX 7700XT cards was based on GPU-Z readings rather than the numbers simply entered in Adrenalin, MSI Afterburner, RivaTuner, etc.?
Anyone who needs stats about their GPU beyond thermals should stop using Furmark and use any benchmark that pushes the gaming portion of the GPU to 100%: 3DMark Time Spy, Unigine Valley, Superposition, etc. Furmark acts as a power virus, and both GPU vendors have implemented protection measures that restrict the load on their GPUs so they don't take damage. So Furmark has been useless for years now.
 