Saturday, November 13th 2010

Disable GeForce GTX 580 Power Throttling using GPU-Z

NVIDIA shook the high-end PC hardware industry earlier this month with the surprise launch of its GeForce GTX 580 graphics card, which extended the single-GPU performance lead NVIDIA has been holding. It also delivers some solid performance-per-Watt improvements over the previous generation. The reference design board, however, uses clock speed throttling logic that reduces clock speeds when an extremely demanding 3D application such as Furmark or OCCT is run. While this is a novel way to protect components and save consumers from potentially permanent damage to the hardware, it is a gripe for expert users, enthusiasts and overclockers who know what they're doing.

GPU-Z developer and our boss W1zzard has devised a way to make disabling this protection accessible to everyone (who knows what they're dealing with), and came up with a nifty new feature for GPU-Z, our popular GPU diagnostics and monitoring utility, that can disable the throttling mechanism. It is a new command-line argument for GPU-Z: "/GTX580OCP". Start the GPU-Z executable (within Windows, using Command Prompt or a shortcut) with that argument, for example "X:\gpuz.exe /GTX580OCP", and it will disable the clock speed throttling mechanism. It will stay disabled for the remainder of the session, so you can close GPU-Z afterwards; throttling will be enabled again on the next boot.
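To illustrate (the folder below is only a placeholder; substitute wherever you actually saved the test build), the command typed into a Command Prompt would look like the line below, and the very same line can be pasted into the Target field of a Windows shortcut so the throttle is disabled with a double-click:

"C:\Downloads\gpuz.exe" /GTX580OCP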
As an obligatory caution, be sure you know what you're doing. TechPowerUp is not responsible for any damage caused to your hardware by disabling that mechanism. Running the graphics card outside of its power specifications may result in damage to the card or motherboard. We have a test build of GPU-Z (which otherwise carries the exact same feature set as GPU-Z 0.4.8), and we also ran a power consumption test on our GeForce GTX 580 card demonstrating how disabling that logic affects power consumption.

DOWNLOAD: TechPowerUp GPU-Z GTX 580 OCP Test Build

116 Comments on Disable GeForce GTX 580 Power Throttling using GPU-Z

#27
claylomax
T3RM1N4L D0GM4: I heard W1z is magic.... isn't it?
W1z is short for W1zzard, the administrator of the Techpowerup website. Wizzard was a 1970's rock UK band, one of their hits is popular this time of year here in the UK: www.youtube.com/watch?v=ZoxQ4Ul_DME
Posted on Reply
#28
W1zzard
T3RM1N4L D0GM4: I heard W1z is magic.... isn't it?
but don't tell anyone or they might tax it
Posted on Reply
#29
HTC
I just have a couple of questions:

- Are there any performance gains when not limiting?

- Is it worth it to remove the limiter?
Posted on Reply
#30
LAN_deRf_HA
This isn't going to burn anything out. The best use of this would be 15-20 minute stress testing sessions to ensure your overclock stability. Even doing it a dozen times isn't going to hurt anything, and you're unlikely to need to do it any more often than that.

Funny though, 350 W makes it seem like the card isn't any more power efficient at all.
Posted on Reply
#31
T3RM1N4L D0GM4
W1zzard: but don't tell anyone or they might tax it
Sure, it will be our little secret :pimp:

Back on topic: nice performance/Watt ratio for this 580, rly :laugh: compared to the 'old' GTX 480... the 480 has no power throttling cheat, right?
Posted on Reply
#32
a_ump
So basically, Nvidia implemented a 2nd throttle at the software level to make power consumption levels of the GTX 580 look lower? That's what I'm getting out of this. Of course, we need to wait and see what results other GTX 580 users get.
Posted on Reply
#33
segalaw19800
Wonder if you RMA a burned card, will they know that power throttling was disabled??? :o
Posted on Reply
#34
Hayder_Master
Great work, W1zzard; I don't like a crappy card without overclocking.
Posted on Reply
#36
Steevo
Nvidia: They didn't like wood screws, so we found another use for them.



We put a wood block under the throttle.


Posted on Reply
#37
wiak
be prepared to destroy your PSU :P
Posted on Reply
#38
the54thvoid
Super Intoxicated Moderator
a_ump: So basically, Nvidia implemented a 2nd throttle at the software level to make power consumption levels of the GTX 580 look lower? That's what I'm getting out of this. Of course, we need to wait and see what results other GTX 580 users get.
I read somewhere that, for now, it's a software implementation only for Furmark and OCCT, so it shouldn't be active with anything else. Did I read this right?
Posted on Reply
#39
Bundy
W1zzard: this should not affect temperature protection, which will remain at 97°C



it won't. just as much as any card other than gtx 580 does not have this kind of protection either
Is that temperature protection based on multiple sensors (GPU, VRM) or the GPU only? If the power limiting is intended to protect the VRM rather than the GPU, this mod may push users' cards closer to failure than they expect. If the protection is aimed at the GPU, then the temp limit will work OK.

As was advised, user beware is the relevant issue here.
Posted on Reply
#40
HillBeast
qubit: True, but no other previous cards have used as much power as the 480 & 580, which makes a burnout more likely.
What everyone fails to remember is that supercomputers (especially ones based on the more modern POWER chips, POWER6 and POWER7) have very powerful and very hot CPUs in them. They draw well over 300 W each and those puppies are at full load for months and months on end. Yes, they are designed to handle it, but if a card is designed correctly, it can easily handle being at full load.

This throttling system wasn't NVIDIA saying the cards can't go higher than what they rated them for; it's just NVIDIA trying to make the card look like it's not as power hungry as it really is.
Posted on Reply
#41
Imsochobo
HillBeast: What everyone fails to remember is that supercomputers (especially ones based on the more modern POWER chips, POWER6 and POWER7) have very powerful and very hot CPUs in them. They draw well over 300 W each and those puppies are at full load for months and months on end. Yes, they are designed to handle it, but if a card is designed correctly, it can easily handle being at full load.

This throttling system wasn't NVIDIA saying the cards can't go higher than what they rated them for; it's just NVIDIA trying to make the card look like it's not as power hungry as it really is.
My server board runs 130 W CPUs and has two phases each, with no big cooling on the PWM either.
It's an 8-CPU board.

So your 16 phases... do you need them? I've taken world records on 4! The average Joe doesn't need so many phases and all that crap.
Motherboards can be cheap and stock performance rarely differs; overclocking, on the other hand, may require something more expensive, plus CF and SLI.

Back on topic, it's funny though.
350 W.
ATI manages to push a dual-GPU card, with lowered perf/Watt due to scaling and a double set of memory with one of them doing nothing, and yet it has better perf/Watt.
I hope ATI's engineers are getting a little bonus!
Posted on Reply
#42
qubit
Overclocked quantum bit
HillBeast: What everyone fails to remember is that supercomputers (especially ones based on the more modern POWER chips, POWER6 and POWER7) have very powerful and very hot CPUs in them. They draw well over 300 W each and those puppies are at full load for months and months on end. Yes, they are designed to handle it, but if a card is designed correctly, it can easily handle being at full load.

This throttling system wasn't NVIDIA saying the cards can't go higher than what they rated them for; it's just NVIDIA trying to make the card look like it's not as power hungry as it really is.
That would be nice if it's true, but I don't think nvidia would spend money implementing and building in a performance limiting feature (frame rate drops when it kicks in) just to put a certain idea in people's minds.

As W1zzard said to me earlier and you did just now, the card can consume any amount of power and run just fine with it, as long as the power circuitry and the rest is designed for it.

And that's the rub.

Everything is built to a price. While those POWER computers are priced to run flat out 24/7 (and believe me, they really charge for this capability), a power-hungry consumer grade item, including the expensive GTX 580, is not. So, the card will only gobble huge amounts of power for any length of time when an enthusiast overclocks it and runs something like FurMark on it. Now, how many of us do you think there are to do this? Only a tiny handful. Heck, even out of the group of enthusiasts, only some of them will ever bother to do this. The rest of us (likely me included) are happy to read the articles about it and avoid unnecessarily stressing out our expensive graphics cards. I've never overclocked my current GTX 285, for example. I did overclock my HD 2900 XT though.

The rest of the time, the card will be either sitting at the desktop (hardly taxing) or running a regular game at something like 1920x1080, which won't stress it anywhere near this amount. So nvidia are gonna build it to withstand this average stress reliably. Much more and reliability drops significantly.

The upshot is that they're gonna save money on the quality of the board used for the card, its power components and all the other things that would take the strain when it's taxed at high power. This means that Mr Enthusiast over here at TPU is gonna kill his card rather more quickly than nvidia would like and generate lots of unprofitable RMAs. Hence, they just limit the performance and be done with it. Heck, it also helps to guard against the clueless wannabe enthusiast that doesn't know what he's doing and maxes the card out in a hot, unventilated case. ;)

Of course, now that there's a workaround, some enthusiasts are gonna use it...

And dammit, all this talk of GTX 580s is really making me want one!! :)
Posted on Reply
#43
lism
So basically the card is capped at a certain level of power usage, but will it increase performance in Furmark as soon as this trigger is set off?

Or is it just abnormal power usage by the VRMs to protect them from burning out? A few GTXs were also reported with fried VRMs from using Furmark.
Posted on Reply
#44
qubit
Overclocked quantum bit
lism: So basically the card is capped at a certain level of power usage, but will it increase performance in Furmark as soon as this trigger is set off?

Or is it just abnormal power usage by the VRMs to protect them from burning out? A few GTXs were also reported with fried VRMs from using Furmark.
Performance will increase noticeably as soon as the cap is removed.

EDIT: You might also want to read my post, the one before yours, which explains why this kind of limiter is being put in.
Posted on Reply
#45
Wile E
Power User
qubit: That would be nice if it's true, but I don't think nvidia would spend money implementing and building in a performance limiting feature (frame rate drops when it kicks in) just to put a certain idea in people's minds.

As W1zzard said to me earlier and you did just now, the card can consume any amount of power and run just fine with it, as long as the power circuitry and the rest is designed for it.

And that's the rub.

Everything is built to a price. While those POWER computers are priced to run flat out 24/7 (and believe me, they really charge for this capability), a power-hungry consumer grade item, including the expensive GTX 580, is not. So, the card will only gobble huge amounts of power for any length of time when an enthusiast overclocks it and runs something like FurMark on it. Now, how many of us do you think there are to do this? Only a tiny handful. Heck, even out of the group of enthusiasts, only some of them will ever bother to do this. The rest of us (likely me included) are happy to read the articles about it and avoid unnecessarily stressing out our expensive graphics cards. I've never overclocked my current GTX 285, for example. I did overclock my HD 2900 XT though.

The rest of the time, the card will be either sitting at the desktop (hardly taxing) or running a regular game at something like 1920x1080, which won't stress it anywhere near this amount. So nvidia are gonna build it to withstand this average stress reliably. Much more and reliability drops significantly.

The upshot is that they're gonna save money on the quality of the board used for the card, its power components and all the other things that would take the strain when it's taxed at high power. This means that Mr Enthusiast over here at TPU is gonna kill his card rather more quickly than nvidia would like and generate lots of unprofitable RMAs. Hence, they just limit the performance and be done with it. Heck, it also helps to guard against the clueless wannabe enthusiast that doesn't know what he's doing and maxes the card out in a hot, unventilated case. ;)

Of course, now that there's a workaround, some enthusiasts are gonna use it...

And dammit, all this talk of GTX 580s is really making me want one!! :)
I don't think the limit is there to protect the card. I think it is there to make power consumption numbers look better, and to allow them to continue to claim compliance with PCI-SIG specs for PCIe power delivery.
Posted on Reply
#46
HillBeast
Imsochobo: My server board runs 130 W CPUs and has two phases each, with no big cooling on the PWM either.
It's an 8-CPU board.

So your 16 phases... do you need them? I've taken world records on 4! The average Joe doesn't need so many phases and all that crap.
Motherboards can be cheap and stock performance rarely differs; overclocking, on the other hand, may require something more expensive, plus CF and SLI.
What on earth are you on about with 16 power phases? I was talking about POWER7: the IBM PowerPC CPU used in supercomputers. Those are VERY power hungry chips. Your 130 W server chip wouldn't compare to the performance these things provide and the power these things need. Why did you quote me when you weren't even remotely talking on the same topic as me?

And I don't have 16 power phases on my motherboard, if that is what you were referring to. Power phases mean nothing; it's how they are implemented. My Gigabyte X58A-UD3R with 8 analog power phases can overclock HIGHER than my friend's EVGA X58 Classified with 10 digital power phases.
Wile E: I don't think the limit is there to protect the card. I think it is there to make power consumption numbers look better, and to allow them to continue to claim compliance with PCI-SIG specs for PCIe power delivery.
Exactly.
Posted on Reply
#47
qubit
Overclocked quantum bit
Wile E: I don't think the limit is there to protect the card. I think it is there to make power consumption numbers look better, and to allow them to continue to claim compliance with PCI-SIG specs for PCIe power delivery.
Hmmm... the compliance angle sounds quite plausible. Does anyone have inside info on why nvidia implemented this throttle?

The built to a price argument still stands though.
Posted on Reply
#48
Wile E
Power User
qubit: Hmmm... the compliance angle sounds quite plausible. Does anyone have inside info on why nvidia implemented this throttle?

The built to a price argument still stands though.
Not really, because the boards and power phases are more than enough to support the power draw of the unlocked cards. We already know what the components are capable of, and they are more than enough for 350W of draw.

If anything, adding throttling has added to the price of the needed components.
Posted on Reply
#49
newtekie1
Semi-Retired Folder
What amazes me is how many people think this is some major limiter that will hinder performance or kick in when the card goes over a certain current level.

It is software based: it detects OCCT and Furmark, and that is it. It will not affect any other program at all. Anyone remember ATI doing this with their drivers so that Furmark wouldn't burn up their cards?
qubit: True, but no other previous cards have used as much power as the 480 & 580, which makes a burnout more likely.
Ummmm... there most certainly have been.
Wile E: I don't think the limit is there to protect the card. I think it is there to make power consumption numbers look better, and to allow them to continue to claim compliance with PCI-SIG specs for PCIe power delivery.
I really have a hard time believing that they did it to make power consumption look better. Any reviewer should pick up right away on the fact that under normal gaming load the card is consuming ~225 W and under Furmark it is only consuming ~150 W. Right there it should throw up a red flag, because Furmark consumption should never be drastically lower, or lower at all, than normal gaming numbers. Plus, the performance difference in Furmark would be pretty evident to a reviewer who sees Furmark performance numbers daily. Finally, with the limiter turned off the power consumption is still lower, and for a Fermi card that is within 5% of the HD 5970 to have pretty much the same power consumption is an impressive feat that doesn't need to be artificially enhanced.
Posted on Reply
#50
KainXS
I have been wondering: when non-reference versions of the GTX 580 start coming out, what's the chance that some of those non-reference cards don't even use the chips to throttle and just run at full blast?
Posted on Reply