
How much power does your PC use?

Well, that's easy then: we ignore the GPU aspect of this benchmark and use another one.

Recommendations? Something easily downloaded (small file size!) and installed that preferably gives a score, and that works with multi-GPU.
How about OpenCL Julia4D in MSI Kombustor 3.5? It works with two GPUs (at least for me) and seemingly almost fully loads both GPGPU processors. Not all Kombustor modes will use CFX (or even use it correctly, for that matter), but this one seems to be the most balanced for multi-GPU systems out of the ones I've tested in Kombustor. Also, unlike most windowed applications, it will use multi-GPU without fullscreen being enabled, which is a huge plus in my personal opinion.

Edit: It's worth noting that not all OpenCL applications fully utilize all the parts of a GPU so power consumption won't be a maximum figure. This really is more of a benchmark than a power consumption gauge, at least with OpenCL Julia4D.
 
I don't think OpenCL is a valid way to measure it, because OpenCL performance varies relative to regular gaming performance.
 
I don't think OpenCL is a valid way to measure it, because OpenCL performance varies relative to regular gaming performance.
Regular games will also tend to use CFX. I'm just trying to hit as many things as possible, and the only thing this misses is full utilization of the GPU's components. It's a better measure than Cinebench's OpenGL test, that's for sure.
 
I think we need a DX11 test.

Maybe Heaven? We can always go with something easy like 1024x768 or 1280x720 fullscreen that everyone can run.
 
I do not have a wall meter, but I can disconnect everything else, leave only the computer powered on, and measure its consumption by reading my electricity meter.

May I do it that way? If yes, I'll be posting my results tomorrow (too late right now).

Thanks!
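For anyone wanting to try the same trick: the electricity-meter approach can be turned into watts by timing the meter's impulse LED. This is a rough sketch, and the impulse constant (1000 imp/kWh here) is only an assumption; the real value is printed on the meter faceplate.

```python
def watts_from_meter(seconds_per_impulse, imp_per_kwh=1000):
    """Estimate average draw from the time between two impulse-LED blinks.

    imp_per_kwh is the impulse constant printed on the meter faceplate;
    1000 is a common value, not a given.
    """
    # One kWh = 3,600,000 watt-seconds, split across imp_per_kwh impulses.
    joules_per_impulse = 3_600_000 / imp_per_kwh
    # Average power = energy per impulse / time between impulses.
    return joules_per_impulse / seconds_per_impulse

# At 1000 imp/kWh, a blink every 56.25 s works out to 64 W (the idle
# figure below); a blink every ~33 s would be roughly 110 W.
print(watts_from_meter(56.25))  # 64.0
```

The longer you time (several impulses rather than one), the less the reading jitters.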
That won't be accurate at all.
I did some tests and it was accurate enough (I think):

Idle = 64W
CPU @ ~100% = 75W (video conversion task)
CPU + GPU @ ~100% = 110W (video conversion + furmark)

60W light bulb = 62.91W (63W).

It took just a few seconds to get the data... And I had a cell phone charging while measuring the data if that counts as additional energy.

But if you say it is not accurate enough to be included in your list, then I respect your decision. Saying it is "NOT ACCURATE AT ALL", though, is a bit exaggerated, I believe.

Sorry!
 
I did some tests and it was accurate enough (I think):

Idle = 64W
CPU @ ~100% = 75W (video conversion task)
CPU + GPU @ ~100% = 110W (video conversion + furmark)

60W light bulb = 62.91W (63W).

It took just a few seconds to get the data... And I had a cell phone charging while measuring the data if that counts as additional energy.

But if you say it is not accurate enough to be included in your list, then I respect your decision. Saying it is "NOT ACCURATE AT ALL", though, is a bit exaggerated, I believe.

Sorry!

You may be missing a dozen devices using 1-2W each. A phone can use 10W, the charger without a phone attached 1-2W, etc.
 
You may be missing a dozen devices using 1-2W each. A phone can use 10W, the charger without a phone attached 1-2W, etc.
Sorry, I did not understand what you really meant by all that.
 
You may be missing a dozen devices using 1-2W each. A phone can use 10W, the charger without a phone attached 1-2W, etc.
He's using a modern Celeron and a GeForce 9500 GT; a rig like that is not going to use much power. Now, something like my rig will eat up 200 watts easy just idling (not including the monitors, which is 45 watts between the 3 of them). When stuff starts getting power gated, I'll see usage more like 185 watts at idle. Granted, I also have two graphics cards, 5 HDDs, 2 SSDs, a socket 2011 CPU, and quad-channel memory, so there are plenty of places all of that power can be going. :p
 
299W + 162W powering my computer (system specs), my server, a 24" CCFL LCD, a 19" CCFL LCD, a KVM, an 8-port gigabit switch, and an external drive, with both systems running BOINC at 100% (GPUs not loaded). There are 9 hard drives between them and both have 80+ Silver or better power supplies.

I don't rightly know which monitor is plugged into which UPS but they really need to be taken together because they're dependent on each other.
 
There are 9 hard drives between them and both have 80+ Silver or better power supplies.
This makes a good difference; mine is an 80 PLUS standard unit... I guess I could save precious watts by getting an 80 PLUS Gold.

I'd like to change my GPU as well, it is quite old already... And I forgot to mention I had 1 modem and 1 router powered on while measuring PC power.
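The savings from a higher efficiency tier can be estimated from the certification figures: the base 80 PLUS tier requires 80% efficiency at 20/50/100% load, while Gold requires roughly 87/90/87% at 115V. A minimal sketch, treating the 110W full-load figure from earlier as the DC-side load purely for illustration:

```python
def wall_draw(dc_load_w, efficiency):
    # The PSU draws dc_load / efficiency from the wall;
    # the difference is lost as heat in the PSU.
    return dc_load_w / efficiency

# Compare the base 80 PLUS tier (80% at 50% load) against Gold
# (90% at 50% load, 115 V certification figures):
base = wall_draw(110, 0.80)   # 137.5 W at the wall
gold = wall_draw(110, 0.90)   # ~122.2 W at the wall
print(round(base - gold, 1))  # ~15.3 W saved
```

At a light load like this, the absolute saving is modest, which is worth weighing against the price of the better unit.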
 
Last edited:
Add my result

TRWOV | Cooler Master Silent Pro 700M (Bronze) | FX8350 @ 4.4GHz + 7970 @ 1100/1600 | 79W | 220W | P9454 365W | 25.90 | http://www.3dmark.com/3dm11/8982998


Not too bad; I was expecting to hit 400W or something considering the 8350 and the 80+ Bronze PSU. The CPU is OCed at stock voltages; the GPU is set at 1.150V on the core (stock was 1.2V).


BTW, when getting your readings, don't keep Chrome open. With Chrome open I was hitting 100W at idle :confused: Having other programs open was fine (Internet Explorer, Word, etc.) but with Chrome I got higher readings for some reason.

EDIT: found this: http://lifehacker.com/google-chrome-kills-battery-on-windows-faster-than-ie-o-1605816299


EDIT2: @FX-GMC, what's the Vcore on your 8320 @ 4.6? I see that you got a peak of 405W in 3DMark11 with a 760 :confused:
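The points-per-watt column in these entries is just the 3DMark11 score divided by the peak wall draw during the run. A quick sketch checking the row above:

```python
def points_per_watt(score, peak_watts):
    # Efficiency metric used in this thread's results table:
    # 3DMark11 score divided by peak wall draw during the run.
    return round(score / peak_watts, 2)

# The row above: P9454 at a 365 W peak.
print(points_per_watt(9454, 365))  # 25.9
```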
 
Last edited:
If you disable hardware acceleration in Flash within Chrome, power usage drops a lot. I've noticed Chrome can raise power consumption vs. other browsers as well.
 
Idle = 64W
CPU @ ~100% = 75W (video conversion task)
CPU + GPU @ ~100% = 110W (video conversion + furmark)
Now look at the readings with my system underclocked:

Idle = 58W (-9.37%)
CPU @ ~100% = 62W (video conversion task) (-17.33%)
CPU + GPU @ ~100% = 82W (video conversion + furmark) (-25.45%)

Details:
Memory from 1333MHz to 800MHz.
CPU from 2.6GHz to 1.6GHz + all extra cores deactivated.
GPU clocks (original / underclocked): 550-500-1350 / 250-500-625 MHz

Amazing, that shows the power of underclocking! I'm ready for summer now! :)
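The percentage figures above are just the relative change between the before and after readings, which can be sketched as:

```python
def pct_change(before_w, after_w):
    # Relative change in draw; negative means a saving.
    return round((after_w - before_w) / before_w * 100, 2)

# The before/after pairs above:
print(pct_change(64, 58))    # -9.38 (the post truncates -9.375 to -9.37)
print(pct_change(75, 62))    # -17.33
print(pct_change(110, 82))   # -25.45
```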
 
Since we haven't had any GTX 970s on the board yet, I'll get around to doing a run in the next hour or so.
 
@Mussels

FreedomEclipse | Corsair AX850 80Plus Platinum (94% Efficiency@50%) | i7-3930k @ 4.6GHz + GTX 790 SLi | 120w idle | 381w | P22882 627W | 36.39 points per watt | Stock GPU + Belkin Conserve Insight power meter

::EDIT::

It's funny that, for graphics cards that are supposed to be more power efficient than the GTX 680s I was running previously, they drew more power running the first stage of 3DMark11...

I've run the benchmark 3 times just to make sure that I wasn't trippin' balls.
 
Last edited:
@Mussels

FreedomEclipse | Corsair AX850 80Plus Platinum (94% Efficiency@50%) | i7-3930k @ 4.6GHz + GTX 790 SLi | 120w idle | 381w | P22882 627W | 36.39 points per watt | Stock GPU + Belkin Conserve Insight power meter

::EDIT::

It's funny that, for graphics cards that are supposed to be more power efficient than the GTX 680s I was running previously, they drew more power running the first stage of 3DMark11...

I've run the benchmark 3 times just to make sure that I wasn't trippin' balls.

Removing a bottleneck that existed on the old hardware can often allow the new stuff to chew more power, despite being more efficient. If your GPU was limiting you before, now more of your CPU can be used, so overall wattage goes up.
 
Removing a bottleneck that existed on the old hardware can often allow the new stuff to chew more power, despite being more efficient. If your GPU was limiting you before, now more of your CPU can be used, so overall wattage goes up.

Kind of strange how the wPrime result went up as well, though. I might need to recheck my voltages in the BIOS.
 
Kind of strange how the wPrime result went up as well, though. I might need to recheck my voltages in the BIOS.

Video card less efficient at idle? Anything like a web browser or video playback that could have forced it into low-3D clocks?
 
I've just replaced the GTX 660 OEM from my previous entry with an R9 290. Fiddling around with clock speeds and voltage today, then I'll tally up the results in the next couple of days.

i7-3930k @ 4.6GHz + GTX 790 SLi
Think you have a typo there, or perhaps some very exotic video cards ;)
 
My usual power usage (per watt-o-meter reading) is 400W while gaming. I've had it peak at 480-500W when I had my HD7950 overclocked to 1300/7000 and under some really heavy gaming load in Natural Selection 2 (max details and 40 people on the server).
 
My usual power usage (per watt-o-meter reading) is 400W while gaming. I've had it peak at 480-500W when I had my HD7950 overclocked to 1300/7000 and under some really heavy gaming load in Natural Selection 2 (max details and 40 people on the server).

Vsync on or off? That's a lot higher than my system, and apart from your CPU our systems are fairly similar.
 
Vsync OFF. Be aware that the Core i7 920 is quite a bit more power hungry than the Core i7 2600. And the HD7950 @ 1300/7000 was eating power like mad. I think I was pushing it at 1.35V and with a modified TDP limit of 300W (stock is around 175W). I had to back it off a bit because I was sometimes getting graphical glitches and it wasn't 100% stable. But it was maaaaaaad.
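That kind of jump is what first-order CMOS scaling predicts: dynamic power grows roughly with frequency times voltage squared, so the voltage bump hurts far more than the clock bump. A rough sketch; the 800 MHz / 1.09 V stock operating point below is an assumed illustrative figure, not something stated in the post:

```python
def scaled_power(base_w, f0, v0, f1, v1):
    # First-order CMOS dynamic-power scaling: P is proportional to f * V^2.
    # Ignores static leakage, so treat the result as a rough trend estimate.
    return base_w * (f1 / f0) * (v1 / v0) ** 2

# Illustrative only: ~175 W stock board power (mentioned above), with an
# assumed 800 MHz / 1.09 V stock point, pushed to 1300 MHz / 1.35 V:
print(round(scaled_power(175, 800, 1.09, 1300, 1.35)))  # ~436 W
```

The real card won't draw the full first-order estimate, but it shows why the stock TDP limit had to be raised so far past 175W.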
 
My 4P Opteron server (48 cores) draws 780W at the wall under full load with an EVGA SuperNOVA 1000 G2 Gold. It used to pull 870W with an older PCP&C Silencer 750 Mark IV rated Bronze. This is all CPU, with no GPUs.
 
My 4P Opteron server (48 cores) draws 780W at the wall under full load with an EVGA SuperNOVA 1000 G2 Gold. It used to pull 870W with an older PCP&C Silencer 750 Mark IV rated Bronze. This is all CPU, with no GPUs.
I take it that's with the Opterons running overclocked? 780W seems a little high for stock, but I could be wrong.
 
He's using a modern Celeron and a GeForce 9500 GT; a rig like that is not going to use much power. Now, something like my rig will eat up 200 watts easy just idling (not including the monitors, which is 45 watts between the 3 of them). When stuff starts getting power gated, I'll see usage more like 185 watts at idle. Granted, I also have two graphics cards, 5 HDDs, 2 SSDs, a socket 2011 CPU, and quad-channel memory, so there are plenty of places all of that power can be going. :p

Yikes! I have the same CPU as you and a 290X, and my system idles at around 78W. What a difference a few hard drives make.
 