Saturday, November 13th 2010

Disable GeForce GTX 580 Power Throttling using GPU-Z

NVIDIA shook up the high-end PC hardware industry earlier this month with the surprise launch of its GeForce GTX 580 graphics card, extending the single-GPU performance lead the company has been holding. The card also delivers solid performance-per-Watt improvements over the previous generation. The reference design board, however, uses clock speed throttling logic that reduces clock speeds when an extremely demanding 3D application such as Furmark or OCCT is run. While this is a novel way to protect components, saving consumers from potentially permanent damage to the hardware, it is a gripe for expert users, enthusiasts, and overclockers who know what they're doing.

GPU-Z developer and our boss W1zzard has devised a way to make disabling this protection accessible to everyone (who knows what he's dealing with), and came up with a nifty new feature for GPU-Z, our popular GPU diagnostics and monitoring utility, that can disable the speed throttling mechanism. It is a new command-line argument for GPU-Z: "/GTX580OCP". Start the GPU-Z executable (within Windows, using Command Prompt or a shortcut) with that argument, and it will disable the clock speed throttling mechanism. For example: "X:\gpuz.exe /GTX580OCP". Throttling will stay disabled for the remainder of the session, even after you close GPU-Z, and will be enabled again on the next boot.
As an obligatory caution: be sure you know what you're doing. TechPowerUp is not responsible for any damage caused to your hardware by disabling that mechanism. Running the graphics card outside of its power specifications may result in damage to the card or motherboard. We have a test build of GPU-Z (which otherwise carries the exact same feature set as GPU-Z 0.4.8). We also ran a power consumption test on our GeForce GTX 580 card, demonstrating how disabling that logic affects power consumption.

DOWNLOAD: TechPowerUp GPU-Z GTX 580 OCP Test Build

116 Comments on Disable GeForce GTX 580 Power Throttling using GPU-Z

#76
MikeX
geez, you guys are killing the planet. 120 fps ain't enough? :P
Posted on Reply
#77
MrHydes
The GTX 580 is not quite what they announced.

I was amazed when reviews pointed to less power consumption and about 10%~20% more performance in some cases (against the GTX 480)... well, now we all know that's not true!
Posted on Reply
#78
newtekie1
Semi-Retired Folder
qubitYeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to also protect the mobo, as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card will simply pull a max of 75W from there, in order to stay PCI-E compliant and the rest through its power connectors, therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?
slyfox2151+1

WTF.


how would the GTX5xx dmg the motherboard?
24 Pin P1 Connector Wires getting extremely hot

That is what can happen if you overload the PCI-e slots. Now that was an extreme case of course, but once you start pulling more than 75w through the PCI-E connector things can get hairy pretty quickly.
Posted on Reply
#79
slyfox2151
Yes, that's true... but he was running more than one card.

Is the slot/card not designed to stop it from sending more than 75 watts through it?
Posted on Reply
#80
qubit
Overclocked quantum bit
newtekie124 Pin P1 Connector Wires getting extremely hot

That is what can happen if you overload the PCI-e slots. Now that was an extreme case of course, but once you start pulling more than 75w through the PCI-E connector things can get hairy pretty quickly.
Thanks NT - that's quite a nasty burn on that connector there.

But my point is: wouldn't the card limit its power draw to stay within that limit and pull the rest from its power connectors? That would prevent any damage to the mobo and stay PCI-E standards compliant. I don't know if it would, which is why I'm throwing the question out to the community.
Posted on Reply
#81
newtekie1
Semi-Retired Folder
slyfox2151yes thats true.. but he was running more then 1 card.


is the slot/card not designed to stop it from sending more then 75watts through it?
Not really, it will attempt to send as much as is demanded of it.
qubitThanks NT - that's quite a nasty burn on that connector there.

But my point is that wouldn't the card limit its power draw to stay within that limit and pull the rest from it's power connectors? That would prevent any damage to the mobo and stay PCI-E standards compliant. I don't know if it would, which is why I'm throwing the question out to the community.
That is pretty much the idea behind this limit. The PCI-E slot provides 75w, a 6-pin PCI-E power connector provides 75w, and an 8-pin PCI-E power connector provides 150w. That is 300w. So once you go over that, it doesn't matter whether the power is coming from the PCI-E power connectors or the motherboard's PCI-E slot: you are overloading something somewhere, and you aren't PCI-E standards compliant.
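As a quick sketch of the budget arithmetic in this post (the wattage figures are the spec numbers quoted above; 360 W is the unthrottled draw reported elsewhere in the thread):

```python
# Per-connector power budget for a reference GTX 580, using the
# PCI-E spec figures quoted above.
SOURCES_W = {"PCI-E slot": 75, "6-pin": 75, "8-pin": 150}

BUDGET_W = sum(SOURCES_W.values())  # 75 + 75 + 150 = 300 W


def out_of_spec(total_draw_w):
    """True once the card's total draw exceeds the combined budget."""
    return total_draw_w > BUDGET_W


print(BUDGET_W)          # 300
print(out_of_spec(360))  # True: an unthrottled Furmark-class load
```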
Posted on Reply
#82
qubit
Overclocked quantum bit
Sure, something would go pop, but it still doesn't answer the question of whether the card would pull more than 75W from the mobo under such a condition. Properly designed, it should limit the current. I just don't know if it does or not, and I don't think anyone else does either.
Posted on Reply
#83
newtekie1
Semi-Retired Folder
qubitSure something would go pop, but it still doesn't answer the question if the card would pull more than 75W from the mobo under such a condition. Properly designed, it should limit the current. I just don't know if it does or not and I don't think anyone else does either.
W1z might know if he has power consumption numbers from just the PCI-E slot.

However, if you assume a fairly even load across all the connectors, 1/4 from the PCI-E slot, 1/4 from the PCI-E 6-pin, and 1/2 from the PCI-E 8-pin, then once the power consumption goes over 300w, the extra will be divided between all the connectors supplying power. I don't believe the power circuits on video cards are smart enough to know that, once power consumption goes over a certain level, they should load certain connectors more than others.
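Taking the even-load assumption above at face value (1/4 slot, 1/4 6-pin, 1/2 8-pin), a short sketch shows that every connector goes over its own limit together once total draw passes 300 W:

```python
# Assumed load shares and per-connector limits, as described in
# the post above.
SHARES = {"PCI-E slot": 0.25, "6-pin": 0.25, "8-pin": 0.50}
LIMITS_W = {"PCI-E slot": 75, "6-pin": 75, "8-pin": 150}


def connector_loads(total_w):
    """Split a total card draw across connectors by the assumed shares."""
    return {name: total_w * share for name, share in SHARES.items()}


# At 360 W total, each connector runs 20% over its rating.
for name, load in connector_loads(360).items():
    over = load - LIMITS_W[name]
    print(f"{name}: {load:.0f} W ({over:+.0f} W vs. its limit)")
```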
Posted on Reply
#84
qubit
Overclocked quantum bit
Thanks NT, that sounds quite likely. And because of this limitation, I'll bet that's why the current limiter operates the way it does.

W1zz, you wanna give us the definitive answer on this one?
Posted on Reply
#85
MikeMurphy
qubitYeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to also protect the mobo, as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card will simply pull a max of 75W from there, in order to stay PCI-E compliant and the rest through its power connectors, therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?
People privy to inside information, and smarter than you and me, decided it was necessary.

I suspect the tech specs published to manufacturers didn't account for the unusual power consumption under Furmark etc. This wouldn't have been an accident, but rather a deliberate decision to keep costs down on power circuitry and cooling.
Posted on Reply
#86
Steevo
I don't believe the logic exists for that either. I believe some cards pull the memory and other power through the PCIe slot, and the core power through the connectors. I hope that is how they have the 580 set up.
Posted on Reply
#87
bakalu
HTC@ bakalu: Any chance you could rename the EXE Furmark to whatever you like and run it again with your 580? If @ anytime you see the temp rising too much, please interrupt the program but do post a screenie after.
Can you answer my question?

Did you buy the GTX 580 to play games or to run Furmark?
Posted on Reply
#88
HTC
bakaluCan you answer my question?

You buy the GTX 580 to play games or run Furmark ?
Neither: I don't buy it.

It took you a long time to reply, but no matter. Since I asked, W1zzard has stated that the card really does react to Furmark and OCCT and, as such, what I asked is now irrelevant.
Posted on Reply
#89
bakalu
HTCNeither: i don't buy it.

Took you a long time to reply but no matter. Since i asked, W1zzard has stated that the card really does react to Furmark and OCCT and, as such, what i asked is now irrelevant.
I bought the GTX 580 to play games, so I don't care about the temperature of the GTX 580 when running Furmark.

The temperature of the GTX 580 when playing games is very cool, and that is what interests me.
Posted on Reply
#90
MikeMurphy
bakaluI bought the GTX 580 to play games so I dont' care the temperature of the GTX 580 when running Furmark

The temperature of the GTX 580 when playing is very cool and that is what interests me.
OK, if you don't care about the topic of this thread then please don't post in it. This isn't a sneering remark or anything, just a way to get this thing back on topic.

Thanks,
Posted on Reply
#91
Mussels
Freshwater Moderator
qubitYeah, watercooling definitely sounds like a good idea for this.

Ya know, I think I read somewhere (was it on TPU?) that the throttle is there to also protect the mobo, as well as the card. However, I don't quite understand why motherboard damage could happen: the PCI-E slot is rated for 75W, so the card will simply pull a max of 75W from there, in order to stay PCI-E compliant and the rest through its power connectors, therefore the risk to the mobo shouldn't be there.

Anyone have the definitive answer to this one?
the same way sticking a wire from your 12V rail onto the metal of your case makes shit melt. excess power use will simply cause shit to fry.
Posted on Reply
#92
HTC
bakaluI bought the GTX 580 to play games so I dont' care the temperature of the GTX 580 when running Furmark
Really? Funny because the first thing you posted on this thread was ...
bakaluMaximum Temp with Furmark - 70oC
forum.amtech.com.vn/attachments/card-do-hoa-video-cards/24475d1289719381-review-amtech-asus-geforce-gtx-580-da-co-mat-o-amtech-temp-70.jpg

Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 480 when running Furmark
forum.amtech.com.vn/attachments/card-do-hoa-video-cards/20213d1274254287-amtech-review-asus-geforce-gtx-480-bai-binh-phuc-han-gtx480-full-load.jpg

Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 580 when running Furmark
forum.amtech.com.vn/attachments/card-do-hoa-video-cards/24364d1289471892-review-amtech-asus-geforce-gtx-580-da-co-mat-o-amtech-cs-peak-chay-furmark.jpg
bakaluThe temperature of the GTX 580 when playing is very cool and that is what interests me.
If you say so ...
Posted on Reply
#93
BorgOvermind
So the card actually reaches above 360W, just as I anticipated. If they had left it unleashed, it would have exceeded the 300W PCI-E spec limit. Good thing it can be unlocked easily, though.
Posted on Reply
#94
Imsochobo
newtekie1Not really, it will attempt to send as much as is demanded of it.



That is pretty much the idea behind this limit. The PCI-E slot provides 75w, a 6-pin PCI-E power connector provies 75w, and an 8-pin PCI-E power connector provides 150w. That is 300w. So once you go over that, it doesn't matter if the power is coming from the PCI-E power connectors or the motherboard's PCI-E slot, you are overloading something somewhere, and you aren't PCI-E standards compliant.
PCI-E can give more than 75 W.
PCI-E 1.1 can only give 75W; 2.0 can give more than that. 150W, if I recall right...
Posted on Reply
#95
bogie
I could do with a stop-throttling tool for my HD5870, as when I watch films on my secondary display it throttles down and causes stuttering playback.

Will it work on the HD5870 as well?
Posted on Reply
#96
W1zzard
bogieWill it work on the HD5870 as well?
no, this is only for the GTX 580 power throttling, which is a unique mechanism at this time that no other card before has ever used
Posted on Reply
#97
BorgOvermind
ImsochoboPCI-E can give more than 75 W
PCi-e 1.1 can only give 75W 2.0 can give more than 75. 150 if i recall right...
I'm not 100% sure, but I don't think so. A card expecting 150W from the slot would not run on a 1.0 slot, yet compatibility is 100%. The additional watts come from the external connectors.
75W + 2x75W from two 6-pins makes 225W; an 8-pin is used if power draw is larger than 225W.
A PCI-E slot delivering 150W could get to 350+W with the extra 8-pin, which is not the case.
Posted on Reply
#98
Mussels
Freshwater Moderator
BorgOvermindI'm not 100% sure, but I don't think so. A card expecting 150W from the slot would not run on a 1.0 slot, but the compatibility is 100%. The additional W come from the external connectors.
75W+2x75W from 2x6Pins makes 225. 8-pins are used if power drain is larger then 225W.
A PCI-E with 150W from slot could get to 350+ with the extra 8-pins, which is not the case.
^ you're correct on the wattages.


Putting what I said in simpler terms:

Drawing more than 75W from the slot won't magically turn the slot off, or anything else like that... if the card has no internal mechanism to deal with the power draw, the wiring feeding the slot will just start to overheat, and bad things can happen.
Posted on Reply
#99
newtekie1
Semi-Retired Folder
ImsochoboPCI-E can give more than 75 W
PCi-e 1.1 can only give 75W 2.0 can give more than 75. 150 if i recall right...
I thought so as well, until I was kindly pointed to the PCI-SIG document explaining the PCI-E 2.0 specs:

www.pcisig.com/developers/main/training_materials/get_document?doc_id=b590ba08170074a537626a7a601aa04b52bc3fec

Page 38 is the important one, establishing how much power can be drawn from where; the slot is still limited to 75w.
Posted on Reply
#100
qubit
Overclocked quantum bit
Mussels^ you're correct on the wattages.


putting what i said in simpler terms:

drawing more than 75W from the slot wont magically turn the slot off, or anything else like that... if the card has no internal mechanism to deal with the power draw, the wiring feeding the slot will just start to overheat, and bad things can happen.
You know, it's really beginning to look like we need an uprated power spec for ATX & PCI-E power delivery. I'm sure we're hitting the same brick wall that CPUs did a few years ago, which is why these latest gen cards are getting held back in the performance they deliver and are not really massively faster than the ones that they replace. I'll bet the new Cayman GPU will have a lot of the same power and heat issues as nvidia's Fermi GPU.

I'm sure that if a 600W power budget (with appropriate cooling) was available, some excellent performance gains could be achieved, like double or more performance and extra rendering features.
Posted on Reply