# Could underclocking too much damage a video card?



## Derek12 (Nov 16, 2010)

I want to underclock my video card to its lowest core/shader clock values (162 MHz / 800 MHz) while I'm not gaming. Could this be bad for it? I've tried it and I don't notice any great performance penalty in 2D, but I'm not sure whether it could damage the card, so I returned to the default values until I can confirm this underclock is safe.
Using EasyTune 6.
Many thanks.


----------



## toastem2004 (Nov 16, 2010)

If you're still putting the stock voltage through it, then maybe. When lowering the speeds, lowering the voltage is also recommended, just as more voltage is usually required when overclocking. A good example of this is Cool'n'Quiet: on a Turion ML-32 at 1.8 GHz the voltage is 1.35 V, and when it downclocks to 800 MHz the voltage drops to 0.925 V.
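Most of the savings in that example come from the voltage drop, since dynamic power scales roughly as P ∝ f·V². A back-of-the-envelope sketch in Python using the Turion numbers above (this scaling law is an approximation and ignores static/leakage power, so it's illustrative only):

```python
# Approximate dynamic power scaling: P ~ f * V^2
# (ignores static/leakage power, so real savings are somewhat smaller)

def relative_dynamic_power(f_ghz: float, v: float) -> float:
    """Dynamic power in arbitrary units, proportional to f * V^2."""
    return f_ghz * v ** 2

full = relative_dynamic_power(1.8, 1.35)   # Turion ML-32 at full speed
idle = relative_dynamic_power(0.8, 0.925)  # downclocked by Cool'n'Quiet

print(f"relative power at idle: {idle / full:.0%}")
# dropping both clock and voltage cuts dynamic power to roughly a fifth
```

Dropping only the clock, by contrast, scales power linearly at best, which is why clock-only underclocking saves so little.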


----------



## newtekie1 (Nov 16, 2010)

It will be completely safe, though it won't do a whole lot and there isn't much reason for it. Now, if you lowered the clocks and the voltage, that would save you some power, but just lowering the clocks won't save much at all.


----------



## Derek12 (Nov 17, 2010)

OK, many thanks. My card is factory-overclocked, so I will simply remove that overclock.

BTW, EasyTune 6 doesn't let me alter the GPU voltage, only the shader/core/memory clocks.
Many thanks


----------



## hat (Nov 17, 2010)

I wouldn't worry about it. You have a 9500 GT... you're not going to save very much power no matter what you do with it.


----------



## Derek12 (Nov 17, 2010)

hat said:


> I wouldn't worry about it. You have a 9500 GT... you're not going to save very much power no matter what you do with it.



Why?
Thanks


----------



## Bo$$ (Nov 17, 2010)

Derek12 said:


> Why?
> Thanks



It doesn't use much power in the first place.


----------



## Tatty_One (Nov 17, 2010)

As has been mentioned, the 9500 GT is in reality a low-power GPU in any case. I think the default voltage to the GPU is 1.00 V, and many 9500s don't actually have separate 2D and 3D clocks, simply because there is little to gain from the savings in the first place. I think idle draw on the overclocked version is only around 95 W.

Of course you can set/lower the 2D clocks and voltage via a BIOS flash; just make sure you don't get over-ambitious with lowering the voltage. Even in 2D, many GPUs have a minimum operating voltage, and if you go too low on either clocks or voltage the card will begin to stutter, much like a car pulling away without enough foot on the accelerator, doing those "donkey" jerks often seen with new drivers because it isn't getting enough gas and/or revs.


----------



## hat (Nov 17, 2010)

Tatty_One said:


> I think idle draw on the overclocked version is only around 95W.



I don't think that's right. 95 W idle is horrific for any card, even a 5970.


----------



## qubit (Nov 17, 2010)

Underclocking your card will definitely not damage it, as others have stated here. Whether it's worth doing on your low power card is another matter.

Note that as you reduce the clock speed, you will notice a slowdown even in desktop 2D performance and if you go down low enough, it will malfunction when trying to display high resolutions, but it won't be damaged.

This partial loss of functionality will also occur if you lower the voltage too much.

Note that the Aero interface actually keeps the card in 3D mode to render its effects.


----------



## Tatty_One (Nov 17, 2010)

hat said:


> I don't think that's right. 95 W idle is horrific for any card, even a 5970.



I think it is........................

http://www.techpowerup.com/reviews/Galaxy/GeForce_9500_GT_Overclocked/24.html

That's for the Galaxy; I was looking earlier at the EVGA OC version, which clocks a little higher and is therefore 95 W. Remember, these cards are on the 65 nm process (Rev 1 release; later releases went 55 nm), NOT 40 nm like most of the more modern cards, so I suppose it would be more accurate to say "they are relatively low power consumption cards for the 65 nm process". If you compare it with the 65 nm GTX 260, you are seeing an additional 20 W of consumption at idle for the 260, and that accelerates to huge amounts at load and max.


----------



## the54thvoid (Nov 17, 2010)

Good god. To think all this time we've been moaning about the power draw of the GTX 480s. Puts it into perspective.


----------



## Black Panther (Nov 17, 2010)

hat said:


> I don't think that's right. 95 W idle is horrific for any card, even a 5970.



The 5970 draws only 39 W at idle 

(the maximum would be another story... say, double the 150 W of the 9500 GT's maximum)

On topic though, I'd say that if the OP manages to reduce his idle power consumption, it isn't a bad idea.


----------



## Disparia (Nov 17, 2010)

I was able to reduce the draw at the wall by about 10 W by adjusting the voltage/clocks of my old 8600 GTS. It only had two voltage options, so I only gained so much.

Now that I think of it, one of my frankenboxes has a 9500 GT in it. Time for tests!


----------



## Derek12 (Nov 17, 2010)

Many thanks for your answers.
I want to underclock it to make my system quieter and to reduce my electricity bill. I enabled Cool'n'Quiet and Smart Fan (my computer is a year and a half old and I didn't know about this feature; the noise level decreased).
The next step is the video card. I know my motherboard has fanless onboard video, but it is very bad for games and videos and steals some of my limited RAM, so I decided to use a dedicated card.
The next step would be an SSD, a VERY long-term goal 
BTW, I have played CoD 4 MW at 500 MHz core and 1350 MHz shader and the GPU temp didn't go above 40 °C, so I think the underclocking is reducing the temps (and maybe power consumption).


----------



## HalfAHertz (Nov 17, 2010)

Tatty_One said:


> I think it is........................
> 
> http://www.techpowerup.com/reviews/Galaxy/GeForce_9500_GT_Overclocked/24.html
> 
> That's for the Galaxy; I was looking earlier at the EVGA OC version, which clocks a little higher and is therefore 95 W. Remember, these cards are on the 65 nm process (Rev 1 release; later releases went 55 nm), NOT 40 nm like most of the more modern cards, so I suppose it would be more accurate to say "they are relatively low power consumption cards for the 65 nm process". If you compare it with the 65 nm GTX 260, you are seeing an additional 20 W of consumption at idle for the 260, and that accelerates to huge amounts at load and max.



You have to remember that those results are old and show the power for the entire PC.


----------



## Tatty_One (Nov 17, 2010)

HalfAHertz said:


> You have to remember that those results are old and show the power for the entire PC



In our reviews it shows card only draw, not whole system draw.


----------



## Derek12 (Nov 18, 2010)

BTW, Tatty_One, the 95 W is the whole computer's idle consumption (the table says "*System* power consumption - idle"); the card's own idle consumption is 20 W and its load consumption is 40 W.
The article also says: *"In order to characterize a video card's power consumption, the whole system's mains power draw was measured. This means that these numbers include CPU, Memory, HDD, Video card and PSU inefficiency."*
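Measuring at the wall like that means card-only figures have to be inferred by differencing two readings and backing out the PSU loss. A hedged sketch of that arithmetic (the 80% efficiency and the 70 W baseline are my assumptions for illustration, not numbers from the review):

```python
# Inferring card-only draw from wall measurements (approximate):
# subtract the wall draw of the same system without the load of interest,
# then correct for PSU efficiency, since the meter sits on the mains side.

def card_power(wall_with_card: float, wall_baseline: float,
               psu_efficiency: float = 0.80) -> float:
    """Estimate DC power drawn by the card from two wall readings."""
    return (wall_with_card - wall_baseline) * psu_efficiency

# e.g. 95 W system idle vs a hypothetical 70 W baseline without the card
print(round(card_power(95, 70)))  # -> 20
```

The differencing step is why a 95 W "system idle" figure is perfectly compatible with a card that only draws around 20 W itself.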


----------



## hat (Nov 18, 2010)

Yeah, I would buy 95 W for entire-system idle power draw, but not for the card alone. The 9500 GT doesn't have a power connector, so it can draw at most 75 W from the slot. It's just not possible for it to idle at 95 W... that could possibly break the board, never mind the power it would draw at load if 95 W were indeed its idle power draw.
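That sanity check can be written out explicitly: a card with no 6/8-pin auxiliary connectors is limited to what the x16 slot supplies, 75 W per the PCIe specification. The little helper below is just an illustration of the budget arithmetic, not a real tool:

```python
# PCIe power budget sanity check: a card with no 6/8-pin connectors
# can only draw what the x16 slot provides (75 W per the PCIe spec).

PCIE_SLOT_W = 75          # max delivered by a PCIe x16 slot
PIN6_W, PIN8_W = 75, 150  # per 6-pin / 8-pin auxiliary connector

def max_board_power(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Upper bound on card power draw given its connectors."""
    return PCIE_SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

claimed_idle = 95
print(max_board_power())                 # 9500 GT: no connectors -> 75 W
print(claimed_idle > max_board_power())  # True: 95 W idle is impossible
```

Any figure above the connector budget has to be a whole-system number.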


----------



## Tatty_One (Nov 18, 2010)

Derek12 said:


> BTW, Tatty_One, the 95 W is the whole computer's idle consumption (the table says "*System* power consumption - idle"); the card's own idle consumption is 20 W and its load consumption is 40 W.
> The article also says: *"In order to characterize a video card's power consumption, the whole system's mains power draw was measured. This means that these numbers include CPU, Memory, HDD, Video card and PSU inefficiency."*



Ahhhh right, my apologies. I didn't see how that could possibly be full-system draw at idle, seeing as that review had an E8400 overclocked to 3.6 GHz. The TDP of a stock E8400 is 65 W, and the review chip was overclocked by 20%, which knocks the TDP up to around 80 W. Add 20 W for the card (100 W), then throw in a motherboard, an HDD, some RAM and fans, and clearly my math is bad, because I don't see for the life of me how that equals 91 W.


----------



## heky (Nov 18, 2010)

@Tatty_One
It's at idle; an E8400 doesn't consume 65 W at idle, even when overclocked. At load, yes, but not at idle.


----------



## Tatty_One (Nov 18, 2010)

heky said:


> @Tatty_One
> It's at idle; an E8400 doesn't consume 65 W at idle, even when overclocked. At load, yes, but not at idle.



Good point, of course that's the "max" TDP rating. My bad, I will go back to sleep now!   Even less reason for the OP to bother risking a flash for a card that consumes so little, TBH.


----------



## Mussels (Nov 18, 2010)

Tatty_One said:


> Ahhhh right, my apologies. I didn't see how that could possibly be full-system draw at idle, seeing as that review had an E8400 overclocked to 3.6 GHz. The TDP of a stock E8400 is 65 W, and the review chip was overclocked by 20%, which knocks the TDP up to around 80 W. Add 20 W for the card (100 W), then throw in a motherboard, an HDD, some RAM and fans, and clearly my math is bad, because I don't see for the life of me how that equals 91 W.



you have now been learneded.


Derek12:

Unlike your CPU with Cool'n'Quiet, there is no guarantee that lowering the temps on your video card will actually decrease noise. The fan could well be hardwired and NOT temperature-controlled at all, in which case lowering the temps leaves it just as loud.


Perhaps look into a replacement GPU cooler instead.


----------



## qubit (Nov 18, 2010)

Mussels said:


> Derek12:
> 
> Unlike your CPU with Cool'n'Quiet, there is no guarantee that lowering the temps on your video card will actually decrease noise. The fan could well be hardwired and NOT temperature-controlled at all, in which case lowering the temps leaves it just as loud.
> 
> ...



Or the silly fan could perhaps make the perfect "business case" for a brand new _better_ graphics card.


----------



## Derek12 (Nov 27, 2010)

I finally left the card at 490 MHz GPU, 1300 MHz shader and 500 MHz memory (this one is the default) and ran the Resident Evil 5 benchmark in DX10 mode and Unigine Heaven in DX10; the max temp reached was 40 °C, so I think the underclocking has some effect anyway, and I haven't noticed major performance issues so far (all the settings were at max except AA, which was 2X).


----------



## Fourstaff (Nov 27, 2010)

Tatty, you should stop staring at your avvy and start reading 

Nice temps, btw. We hardly ever see 40 °C temps on TPU. I think you should keep everything at stock; the card is weak as it is (unless winter fuel is expensive).


----------



## Derek12 (Nov 27, 2010)

Fourstaff said:


> Nice temps, btw. We hardly ever see 40 °C temps on TPU. I think you should keep everything at stock; the card is weak as it is (unless winter fuel is expensive).



Yeah, but the games I play are relatively old and I don't game much; I prefer a quiet computer. 
Now I have a critical problem with the onboard video:
BTW, I removed the video card and now I'm using the onboard (ATI Radeon X2100) for testing purposes and to give it a second try. It's quieter, but when I ran 3D content like WebGL examples (I haven't tried games yet), the 3D ran fine for a few seconds, then the computer hung and a second later showed a gray screen with white vertical lines. Is my onboard video blown? I'm using the generic Windows 7 drivers... I'll update them soon...
However, Aero works fine, no crashes...
Many thanks


----------



## Goodman (Nov 27, 2010)

@Tatty_One, please post where you got your avatar so we could all have this nice gift for Christmas (a full-size picture would be nice  )


----------

