# What does the power draw number mean?



## Stigma (Nov 16, 2006)

So, someone tell me:

What does the power draw number mean?

Running at stock clocks at 100% load, it reports about 24 A (amperes?). At full overclock with the stock fan, I get about 30 A.

From what I know, I believe the card uses the 12-volt input for the majority of its power needs, but surely this reading can't mean that my card is drawing 12 V x 30 A = 360 watts! That's so far off that I know it must be wrong.

So what's the deal? Is the readout just plain old wrong? I wouldn't assume so, because it seemed to scale as I expected going from stock to overclock. And if it's not wrong, what kind of current are those numbers measuring?

And of course, the most important bottom-line question: how do I translate this number into an approximation of what the card is actually drawing in watts?

That was a lot of questions, hehe. I hope someone has some answers for me, and that I didn't miss the answer while I was searching through the older posts.

-Stigma


----------



## ARTOSOFT (Nov 16, 2006)

It's only ~1.7 V driving your GPU, not 12 V.

Regards,
Arto.


----------



## Athlon2K15 (Nov 16, 2006)

But you must remember there is more to a graphics card than the GPU.


----------



## Namslas90 (Nov 16, 2006)

@ Stigma, it's easy to confuse amps (the current) with watts (the rate of doing work).  12 V x 30 A is 360 watts.  Why is that so far off?


----------



## ARTOSOFT (Nov 16, 2006)

AthlonX2 said:


> But you must remember there is more to a graphics card than the GPU.


What is that?  The graphics card is the GPU.

Regards,
Arto.


----------



## ARTOSOFT (Nov 16, 2006)

Namslas90 said:


> @ Stigma, it's easy to confuse amps (the current) with watts (the rate of doing work).  12 V x 30 A is 360 watts.  Why is that so far off?


If ATITool says it draws 30 A, that means:
~1.7 V x 30 A = ~51 W.

Regards,
Arto.
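The disagreement in this thread comes down to which voltage the 30 A reading is multiplied by. A minimal sketch of the arithmetic, using the two candidate voltages mentioned above (the ~1.7 V figure is ARTOSOFT's guess, not a confirmed spec):

```python
# Electrical power is voltage times current: P = V * I.
# The same 30 A reading gives wildly different wattages
# depending on which rail it is assumed to refer to.

def power_watts(volts: float, amps: float) -> float:
    """Electrical power in watts, given volts and amperes."""
    return volts * amps

amps_reported = 30.0  # the ATITool readout under discussion

p_12v = power_watts(12.0, amps_reported)    # if read on the 12 V input rail
p_core = power_watts(1.425, amps_reported)  # if read at the GPU core voltage

print(f"At 12 V:    {p_12v:.1f} W")   # 360.0 W
print(f"At 1.425 V: {p_core:.2f} W")  # 42.75 W
```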


----------



## Stigma (Nov 16, 2006)

So the amp draw only refers to the actual GPU core then? That would make more sense, except I had expected the GPU to draw more than ~50 watts at full load.
Is this confirmed, or are you guys just assuming this is the case?

Oh, and ARTOSOFT, why do you say 1.7 volts for the GPU core? I'm pretty sure the stock Vcore for an x1900XT was 1.425 volts (in 3D mode).

-Stigma


----------



## Namslas90 (Nov 17, 2006)

From my understanding, the GPU is but one integrated circuit on the entire video card; the other components on the card surely draw current too.  I don't dispute what ATITool says, and don't really know how it works.  I also thought the original reading by Stigma was off the box, not from ATITool (please confirm).  I also know for a fact that the written information on the box/user manual is not that accurate.  Furthermore, I don't think his card is pulling 360 watts; I was just verifying the math using the power equation (P = V x I).  Also, power calculations are very particular: don't forget to consider losses due to heat dissipation and other basic losses within the card's circuitry.  I haven't had to do power calculations in about 15 years and can't seem to locate my guide from college.


----------



## ARTOSOFT (Nov 18, 2006)

Stigma said:


> So the amp draw only refers to the actual GPU core then? That would make more sense, except I had expected the GPU to draw more than ~50 watts at full load.
> Is this confirmed, or are you guys just assuming this is the case?
> 
> Oh, and ARTOSOFT, why do you say 1.7 volts for the GPU core? I'm pretty sure the stock Vcore for an x1900XT was 1.425 volts (in 3D mode).
> ...


Stigma, since I don't have an x1900XT, I don't know exactly what voltage the core needs, so I was only guessing.  If my memory serves me right, my card is ~1.3 V.

So, if you say the Vcore for the x1900XT was 1.425 V, you can calculate the GPU power at full load:
1.425 V x 30 A = 42.75 W.

How sure am I?  This was debated some time ago in this forum, so if my memory serves me right, it should be correct.

Regards,
Arto.


----------



## Stigma (Nov 18, 2006)

No, actually the 30 A reading was from ATITool. Sorry if that was unclear.

It could be that the 30 A is just for the GPU (running at 1.425 V), but that would mean that even heavily overclocked, the GPU is running at under 50 watts.

It is of course correct that the GPU is only the main chip on the card, but after about 10 years of building hundreds of computers, I know enough to say that the GPU draws the vast majority of the power the card requires as a whole. At least 90% or so, I would suspect.

We know from testing the total power draw of a complete system that an x1900XT card draws somewhere in the area of 90-120 watts at full load, and whichever way I turn this, I can't get the numbers to make sense.

If it's 30 amps at 12 volts, that's just way too much. I can tell outright that this is not the case.

If it's 30 amps at 1.425 volts, that's only ~43 watts. There is no way the GPU draws that little power (less than half the card's total power consumption).

So neither of these hypotheses seems viable to me, which is why I posted the question here in the first place. Perhaps I can find some contact info for the author of the program, and he can shed some more light on exactly what this reading means...

-Stigma
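The sanity check in the post above can be sketched out: take the 90-120 W measured total as given, and test each interpretation of the 30 A reading against it (the 50% floor for "the GPU is most of the card's draw" is an assumption taken from Stigma's reasoning, not a measured figure):

```python
# Check each interpretation of the 30 A reading against the
# measured 90-120 W total draw of the card at full load.

MEASURED_RANGE_W = (90.0, 120.0)  # total card power, from system-level tests

def plausible_core_power(power_w: float) -> bool:
    """A GPU-core figure should be a large fraction of total card
    power (here: at least half the low end) without exceeding it."""
    lo, hi = MEASURED_RANGE_W
    return 0.5 * lo <= power_w <= hi

hyp_12v = 12.0 * 30.0    # 360 W: far above anything the card could draw
hyp_core = 1.425 * 30.0  # 42.75 W: less than half the card's total

print(plausible_core_power(hyp_12v))   # False - way too much
print(plausible_core_power(hyp_core))  # False - implausibly little
```

Both interpretations fail the check, which is exactly the dead end the post describes.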


----------



## KennyT772 (Nov 18, 2006)

Plain and simple: the amp reading ATITool kicks out is just what is reported by the voltage circuitry for the core. So when running 700/900, I get 32.7 amps being drawn by the core. At that speed my card runs at 1.45 Vcore, so the overall draw in watts for the core is just under 50 W. However, the draw for the entire card is well over 120 W.

Why is that?

Well, as we know from power supplies, nothing is 100% efficient. It is almost impossible for the entire 120 W to reach the core; some is lost in the voltage regulators as heat. Say that accounts for 20 W at max temps and clocks.

The rest of the current is used by the RAMDACs, memory, Rage chip, fan, etc.
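A rough budget under the figures in this post (these are the poster's illustrative estimates, not measurements):

```python
# Rough card power budget using KennyT772's numbers: the core amp
# reading explains only part of the card's total draw; the rest goes
# to regulator losses and the other components on the board.

core_v, core_a = 1.45, 32.7       # reported core voltage and current
core_w = core_v * core_a          # ~47.4 W at the GPU core
vreg_loss_w = 20.0                # heat lost in the voltage regulators (estimate)
total_card_w = 120.0              # overall card draw (estimate)

# Whatever is left over feeds memory, RAMDACs, the fan, and so on.
other_w = total_card_w - core_w - vreg_loss_w

print(f"Core:   {core_w:.1f} W")
print(f"VREGs:  {vreg_loss_w:.1f} W")
print(f"Other:  {other_w:.1f} W")
```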


----------



## Stigma (Nov 18, 2006)

Hmm... well, I can't discount this possibility, and it does seem pretty logical, but as I said earlier, I would expect the GPU to consume the vast majority of the total power. There really isn't that much else on the card that outputs a lot of heat.

But you're right about power conversion. I doubt much of the 12 V is used as-is; rather, it is converted to lower voltages for use on the card, and the conversion process could be quite wasteful (I wouldn't even hazard a guess at what percentage that would be). The MOSFETs do generate a fair amount of heat, though, so there must be some noticeable loss going on.

-Stigma


----------

