
Are components with high power draw an issue for you?

  • No, I don't care

    Votes: 3,199 14.5%
  • Yes (power bill)

    Votes: 7,382 33.5%
  • Yes (heat)

    Votes: 6,286 28.5%
  • Yes (noise)

    Votes: 2,683 12.2%
  • Yes (environment)

    Votes: 2,490 11.3%

  • Total voters
    22,040
  • Poll closed.
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It doesn't bother me, for two reasons:

A) I remember building high-end rigs with tri-SLI'd 480s back in high school with my buddies, complete with OCed 980X CPUs. Pulling 250 W+ on a CPU isn't anything new, and neither are GPUs pushing 400 W. This was the era that birthed both the HAF X case from Cooler Master and the first 2 kW PSUs. This was also the era when OCing a 590 would result in the VRM exploding. So those complaining about power usage and heat output are likely the SAME people who, 14 years ago, were complaining about the power use and heat output of SLI rigs as if every gaming PC were a space heater.

The vast majority of users are not buying this stuff, and for enthusiasts, these heat issues and power draws are nothing new. If anything, managing a single GPU that can spike to 600 W is a LOT easier than managing three 400 W+ OCed watercooled 480s at a time when power supplies, cases, and motherboards were not built with this in mind, and liquid cooling was in its infancy compared to today's ready-made solutions. Mainstream parts like the 6700 XT, the 3060 Ti, the 12400F and the 5600X are not burning up these kinds of numbers, and that mainstream hardware makes up most new gaming PCs.

B) The other complaint is power usage and the environment. Bro, if you're worried about power usage, let me introduce you to: air conditioning, space heaters, dishwashers, electric ovens, and the environmentalist's new boy toy, electric cars. Gaming GPUs are a tiny blip on the radar of energy usage; they're dwarfed by HVAC in total power usage and are fairly rare in the grand scheme of things. Like, me gaming on my PC for multiple hours a day in the winter with nothing else to do? My electric bill is $32 a month. In August, when I'm working outside and hardly touch the PC all month? $176 a month, all down to that AC running. And I have a relatively small house; bigger homes can break $300.

Look up how much energy a Tesla needs to drive 10 miles, and look at the governments and individuals pushing HARD for EVs everywhere. I'll save you the Google search: a Tesla Model S uses roughly 3,000-3,200 watt-hours (3-3.2 kWh) to travel 10 miles. Traveling at 60 MPH therefore means drawing roughly 18-20 kW continuously. And this is the "solution" that will save the planet. Now, that 400 W 3090 is an issue... why?
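If you want to sanity-check that arithmetic, here's a rough sketch in Python. The ~300 Wh/mile figure for a Model S is my assumption; real-world numbers vary with speed, temperature, and model.

```python
# Back-of-the-envelope comparison of EV driving vs. GPU gaming energy.
# Assumption: ~300 Wh per mile for a Tesla Model S (ballpark figure).

EV_WH_PER_MILE = 300       # assumed EV consumption, Wh/mile
GPU_WATTS = 400            # e.g. a 3090 under gaming load
HOURS_GAMING_PER_DAY = 3

ev_10_miles_wh = EV_WH_PER_MILE * 10               # ~3,000 Wh per 10 miles
ev_60mph_kw = EV_WH_PER_MILE * 60 / 1000           # ~18 kW sustained at 60 mph
gpu_day_wh = GPU_WATTS * HOURS_GAMING_PER_DAY      # 1,200 Wh per day of gaming

print(f"EV, 10 miles:  {ev_10_miles_wh:,} Wh")
print(f"EV at 60 mph:  {ev_60mph_kw:.0f} kW continuous draw")
print(f"GPU, {HOURS_GAMING_PER_DAY} h/day:  {gpu_day_wh:,} Wh")
# One hour of highway driving equals ~15 days of 3 h/day gaming sessions.
print(f"1 h at 60 mph = {ev_60mph_kw * 1000 / gpu_day_wh:.0f} days of gaming")
```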

Yeah, compared to that, a gamer using a 400 W GPU for a few hours a day really doesn't matter. At all. Commenters often bring up miners in the GPU energy-usage discussion as well; yeah, miners use a lot, but they're also not comparable to gamers at all. That's like comparing a NASCAR race to the drive to work in fuel usage. Totally different applications. If one is worried about the environment, there are entire FORESTS of lower-hanging fruit to cut first.
As much as I absolutely don't want a 400+ W graphics card in my PC, I have to agree with you up to a point. A gaming PC's power usage is the least concern in saving the environment. Hypocritical EVs will kill the planet sooner than graphics cards will.

I think heat can be a valid reason to hate on these new, high-power GPUs, although looking at them as "the new SLi" sheds some different light on the matter. There's no reason for an average home gamer to buy anything above a current gen x70-series card.

My personal issue doesn't really come from these overkill parts, but rather from the fact that middle-class components have been creeping up in power consumption too. x60-tier cards eat just shy of 200 W, which was high-end territory a couple generations ago. It doesn't affect my power bill much, but it does affect their usability in a small form factor system where it is much harder to get rid of heat. Let's not even mention the complete lack of low-end / light gaming HTPC cards with low profile / passively cooled versions like the 1050 (Ti) had.
 
Joined
Feb 1, 2019
Messages
3,538 (1.68/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
I see in your sig that you also have a GT 1030 like me, AusWolf; that card is a fine reminder of how the industry has changed. If I were to buy a 4000-series card, it would only be to upgrade my VRAM, and if a 16 gig 4070 comes out, that would perhaps be my target, which I hope would still have a reasonably sensible TDP.
 
Joined
Dec 28, 2012
Messages
3,844 (0.89/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
As much as I absolutely don't want a 400+ W graphics card in my PC, I have to agree with you up to a point. A gaming PC's power usage is the least concern in saving the environment. Hypocritical EVs will kill the planet sooner than graphics cards will.

I think heat can be a valid reason to hate on these new, high-power GPUs, although looking at them as "the new SLi" sheds some different light on the matter. There's no reason for an average home gamer to buy anything above a current gen x70-series card.
Heat is going to be a major issue, and would absolutely be a reason I wouldn't want something this power hungry. I did that with SLI a decade ago, and more recently with Vega 64s.
My personal issue doesn't really come from these overkill parts, but rather from the fact that middle-class components have been creeping up in power consumption too. x60-tier cards eat just shy of 200 W, which was high-end territory a couple generations ago. It doesn't affect my power bill much, but it does affect their usability in a small form factor system where it is much harder to get rid of heat.
The power usage has gone up, absolutely, but that's not really because older generations didn't want to go higher. Looking at old high-end GPUs' average gaming power consumption over the last decade, according to TPU:
480 = 223 W
580 = 197 W
680 = 166 W
780 Ti = 229 W
980 Ti = 211 W
1080 Ti = 231 W
2080 Ti = 273 W
3090 = 355 W

Cards like the 480 were renowned for being impossible to cool, hitting 95 °C almost immediately despite what is, by today's standards, a modest power draw. Meanwhile, the 273 W 2080 Ti was easy to cool for AIBs, and third-party 3090s that stick to stock power usage are the same way. Third-party 3090s like the EVGA 3090 Ultra hit only 65 °C at 36 dBA; the Zotac 3090 AMP Extreme hits the mid-70 °C range at just 40 dBA while pulling 455 W. The old Fermi cards ran WAY hotter and were unmanageable despite pulling half the wattage, and they were as loud as hair dryers to boot. Honestly, side note: it's incredible how much cooler design has improved since 2010 for both CPUs and GPUs.

Both the 480 and 580 were pushing the limits of their respective process nodes as well; they couldn't go physically bigger without obliterating yields. Modern nodes have allowed die sizes to be pushed up towards the reticle limit. The 280 was the largest GPU Nvidia had ever made when it released, at 576 mm²; the 580 was 520 mm². The 3090 is a whopping 628 mm². (Side note: the 2080 Ti was a monstrous 754 mm² and holds the current record for the biggest consumer GPU.) Couple that with the sheer transistor density of modern nodes, and we can actually make these monster GPUs that push power envelopes that simply wouldn't have been possible before.

Those midrange 200 W cards show what would have counted as a "3080" if Nvidia had stuck to similar TDPs, and that would have left a lot of performance on the table for gamers. If it could have been managed without a nuclear reactor's cooling systems, previous generations would have pushed higher too, but they were limited by node density. The 3000 series has been pushed far out of its comfort zone largely by boost algorithms that drive these chips to the absolute limit, whereas older generations left OC headroom on the table. Cards like the 500 series could achieve upwards of 50% overclocks with really good coolers. That OCing came with far higher power usage per frame; now that OC is effectively built into newer cards.
Let's not even mention the complete lack of low-end / light gaming HTPC cards with low profile / passively cooled versions like the 1050 (Ti) had.
This has been a killer; my 560X HTPC GPU is long in the tooth, but neither Nvidia nor AMD has bothered releasing a 6 GB+ GPU in the <75 W territory since. The 1650 wasn't bad, but its 4 GB buffer was a dealbreaker.

I was really hoping the 6500 XT would be that GPU, but AMD saw fit to gimp it by clocking it to the moon and giving it a 64-bit bus.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,923 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
As much as I absolutely don't want a 400+ W graphics card in my PC, I have to agree with you up to a point. A gaming PC's power usage is the least concern in saving the environment. Hypocritical EVs will kill the planet sooner than graphics cards will.

I think heat can be a valid reason to hate on these new, high-power GPUs, although looking at them as "the new SLi" sheds some different light on the matter. There's no reason for an average home gamer to buy anything above a current gen x70-series card.

My personal issue doesn't really come from these overkill parts, but rather from the fact that middle-class components have been creeping up in power consumption too. x60-tier cards eat just shy of 200 W, which was high-end territory a couple generations ago. It doesn't affect my power bill much, but it does affect their usability in a small form factor system where it is much harder to get rid of heat. Let's not even mention the complete lack of low-end / light gaming HTPC cards with low profile / passively cooled versions like the 1050 (Ti) had.
Thanks for seeing the EV truth; that's a different topic for another discussion, though.
 
Joined
Oct 27, 2020
Messages
20 (0.01/day)
Unless you have a way of dumping the heat outside, power draw can become an issue. I have a MO-RA3, but I can't reasonably deal with more than 600W sustained power draw, as the room will simply become unbearably hot.
 
Joined
Jul 7, 2019
Messages
908 (0.47/day)
Heat is the main issue. I'll eventually need to install a ductless mini-split just for my computer room if high-performance parts keep getting hotter, if only to avoid excess use of the central HVAC system just to keep that room cool. HVAC zoning is unrealistic in my case, given the cost of insulating interior walls and the cost of additional thermostats, motorized vents, and wiring. Alternatively, I could go old-school extreme PC watercooling: bury a copper coil out in the shaded part of the garden, use an Iwaki pond pump to push all that coolant, poke a hole in the wall, and install a heat exchanger plate that lets me plug the indoor PC into it separately.

Power isn't an issue (yet), only because I have a solar days / free nights plan going for me, meaning I run off solar during the day (anything over the solar generation rate costs me 2x the normal power price), but at night my power is free. My bill is just for water, gas, and solar now. The real issue is the eventual need to upgrade to 20-amp breakers if high-performance components keep growing in maximum power requirements even as efficiency improves. I'm just glad my house already came pre-wired for 20-amp loads per standard outlet; I just have to replace the breakers. But if parts keep getting thirstier, it'll come to the point where I'd have to run 240 VAC to the computer room just for the PC.
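For anyone curious about the breaker math, here's a quick sketch of the continuous load a household circuit can safely carry, assuming US 120 V wiring and the NEC's 80% rule for continuous loads:

```python
# Continuous-load headroom of a US household circuit.
# Assumption: NEC caps continuous loads at 80% of the breaker rating.

VOLTS = 120
CONTINUOUS_FACTOR = 0.8

def usable_watts(breaker_amps: int, volts: int = VOLTS) -> float:
    """Wattage a circuit can safely supply as a continuous load."""
    return breaker_amps * volts * CONTINUOUS_FACTOR

for amps in (15, 20):
    print(f"{amps} A breaker @ {VOLTS} V -> {usable_watts(amps):.0f} W continuous")
# 15 A -> 1,440 W; 20 A -> 1,920 W. A 1,000 W+ PC plus monitors and
# whatever else shares the circuit eats into that headroom quickly.
```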
 
Joined
Apr 29, 2014
Messages
4,286 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtek ALC1150 (onboard)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
I mean, the power bill is always something you have to be concerned with, but the heat output is more of a concern for me. I love liquid cooling everything and don't mind having to do it, but I worry that if we get to some ridiculous levels, it can also cause faster degradation of the components. Still, I think some of it is inevitable in the long run to get higher performance, at least until newer techniques and technologies come out.
 
Joined
Apr 6, 2021
Messages
1,131 (0.86/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
There should be a ban on senseless power-wasting electronics. We might even see some (EU) regulations with the emerging energy crisis from the Russian sanctions. :confused:

They did it already with the ban on incandescent light bulbs, replaced by LED bulbs, and with power limits for vacuum cleaners, pushing cyclone technology. They also forced tech companies to reduce standby waste and set power limits for washing machines. And they set power limits for clothes dryers, which pushed the development of super-efficient heat pump dryers.
 

ChromaticWolf

New Member
Joined
Apr 14, 2022
Messages
7 (0.01/day)
I would say efficiency is my concern. GPUs back in the day were better for a couple of gens in terms of power-to-performance ratio at stock settings, although a good ratio can be achieved on recent gens through undervolting and limiting the power a little, which is what I always do to prolong the lifespan.
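To illustrate the power-limiting part, here's a minimal sketch using the pynvml bindings (pip install nvidia-ml-py) for Nvidia cards. Treat it as a rough example rather than a recipe: setting the limit needs admin rights, and each card has its own allowed range.

```python
# Minimal sketch: reading and lowering an Nvidia GPU's power limit via NVML.
# Requires admin/root to set the limit; the valid range is card-specific.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)  # milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Current limit: {current_mw / 1000:.0f} W "
      f"(card allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Cap the card at 80% of its current limit, clamped to what it allows;
# a mild cap usually costs a few percent of performance at most.
target_mw = max(min_mw, int(current_mw * 0.8))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```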
 

SuperShermanTanker

New Member
Joined
Mar 6, 2021
Messages
10 (0.01/day)
Yes, for all of the above reasons as well. I have an RTX 3000-series GPU and it's already WAYYYY too hot and hungry, even after an undervolt.
 

Deleted member 24505

Guest
Yes, for all of the above reasons as well. I have an RTX 3000-series GPU and it's already WAYYYY too hot and hungry, even after an undervolt.

Water cool it; no need to underclock or undervolt then. Also, it won't be dumping heat into the case.
 
Joined
Nov 17, 2010
Messages
131 (0.03/day)
Location
My Computer
System Name Itty Bitty
Processor 5800X3D
Motherboard Asus x570 Tuff Wifi
Cooling 1860mm worth of rads
Memory G Skill Sniper 3600mhz 64gig
Video Card(s) EVGA GeForce RTX 3090 FTW3 Ultra
Storage Gigabyte Aorus 1TB NVME
Display(s) ASUS ROG Swift PG259QNR 360hz + LG CX55 + Sanyo Z3 on a 10ft screen
Case Coolermaster NR600
Audio Device(s) Optical to home theatre amp, EPOS 670
Power Supply Deepcool PQ1000m
Mouse Roccat Kone Pro
Keyboard Roccat Vulcan Pro
Software Win10
Yes, for all of the above. I never used to care, but I do these days. I live off-grid with solar and batteries, so my power bill is a non-issue, but I do wonder what it would be, considering I run a 3080 and both my kids run 6900s on high-refresh-rate monitors. I also have a limit on how much power I can use at any one time. I'm also upgrading the 3 PCs to 5800X3Ds, as they will be amazing upgrades for our PCs, rather than the 12900Ks that I wanted, whose power usage is stupid for gaming-only PCs compared to the new AMD chip. My current PC is my first dabble in custom-loop watercooling, because I want my video card to stay at max clock and not throttle from getting hot; I have 3 radiators for this, one of them 1260 mm, to keep the PC quiet. My kids have AIOs. Hearing that next-gen cards might pull 600 W is annoying too, as I'd need to buy 3 bigger PSUs, and 600 W dumped into my loop would be less than ideal. What kind of coolers will they have as standard? 5 slots with Delta fans, lol?
 
Joined
Apr 1, 2013
Messages
225 (0.05/day)
Wow, people care more either about themselves ("I don't care") or noise than about the environment...
OK, I understand that increased power draw means more heat, thus more fans, thus more noise, and in the end, power draw determines your bill.

But do you freaking need to spend 300 W on overclocking your dGPU memory to gain a useless +5% FPS when you already have 300 FPS??
300 W is an entire machine for many of us.

You can be selfish or blind to environmental problems, fine.
But noise! Noise, guys!!! You care more about a freaking fan than the (only) planet (we have).
:banghead:

Put on a freaking noise-cancelling headset and don't tick the box "I care more about the noise of my GPU than Earth".
*Screaming inside*
 
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Wow, people care more either about themselves ("I don't care") or noise than about the environment...
OK, I understand that increased power draw means more heat, thus more fans, thus more noise, and in the end, power draw determines your bill.

But do you freaking need to spend 300 W on overclocking your dGPU memory to gain a useless +5% FPS when you already have 300 FPS??
300 W is an entire machine for many of us.

You can be selfish or blind to environmental problems, fine.
But noise! Noise, guys!!! You care more about a freaking fan than the (only) planet (we have).
:banghead:

Put on a freaking noise-cancelling headset and don't tick the box "I care more about the noise of my GPU than Earth".
*Screaming inside*
I believe I speak for many when I say that I didn't vote environment because a 300 W PC, or even a 300 W graphics card, is negligible compared to your fridge, washing machine, vacuum cleaner, electric oven, microwave, etc. in power consumption and environmental impact, especially if you only play games a couple of hours a day (let alone a couple of hours a week).
 

Deleted member 24505

Guest
My PC is near silent now. Fair enough, my loop cost nearly £500, but I do not care; silence is golden, and there is no heat pumped into the case from a 300 W+ GPU. My CPU and GPU are not OC'd; not much point on the CPU as it is pretty powerful (yeah, lol, not heat-wise, nor a Prescott), and the GPU is good enough. For me it is about having no noisy-ass rig and getting the heat out of the case; otherwise I don't give a rat's how much power it uses.

Most people on here have high-end PCs, some with 400 W+ GPUs. Why would these types of people give two shits about how much power their PC uses? It's like asking Ferrari owners if they are bothered that their car only does 8 mpg. Of course they facking aren't, or they would not have a Ferrari. Also, why buy a very high-end enthusiast GPU and downclock/undervolt it to lower power use? Why not just buy a card that actually uses less power in the first place?
 
Joined
Jul 5, 2013
Messages
27,483 (6.63/day)
You can be selfish or blind to environmental problems, fine.
But noise! Noise, guys!!! You care more about a freaking fan than the (only) planet (we have).
:banghead:
You can stop with the virtue signaling. People pay their own power bills. If they want a higher-power card and are willing to pay for the extra electricity, that is their choice, not yours. Trying to guilt-trip people with your silly virtue signaling is NOT welcome here, and you can expect to be told off about it.
 

Deleted member 24505

Guest
You can stop with the virtue signaling. People pay their own power bills. If they want a higher-power card and are willing to pay for the extra electricity, that is their choice, not yours. Trying to guilt-trip people with your silly virtue signaling is NOT welcome here, and you can expect to be told off about it.
Agree on power bills.
 
Joined
Apr 1, 2013
Messages
225 (0.05/day)
I believe I speak for many when I say that I didn't vote environment because a 300 W PC, or even a 300 W graphics card, is negligible compared to your fridge, washing machine, vacuum cleaner, electric oven, microwave, etc. in power consumption and environmental impact, especially if you only play games a couple of hours a day (let alone a couple of hours a week).
You really don't get it, do you?

Let's do some math and calculate real numbers from real life. I'm using Wh/day to keep the calculation understandable.

A PC consumes 500 W when playing. You're using it 3 h/day on average: 1,500 Wh/day.
A washing machine consumes 1,500 Wh per cycle, but you don't do one every day, or at least you share it with your family, unlike your PC. Let's count 1 cycle every 2 days: 750 Wh/day.
An oven consumes 2,000 W for 1 h (time to heat the thing up and then cook a cake or something): 2,000 Wh, but again, you don't use it every day. Let's say you use it fairly often, once every 2 days: 1,000 Wh/day.
A microwave, gosh, 900 W for a few minutes a day? Call it 60 Wh/day :D
A fridge + freezer consumes 1,000 Wh per day. Piece of cake.
Want to add a dishwasher? 1,500 Wh per cycle, same as a washing machine. Same input, 1 cycle every 2 days: 750 Wh/day.

Now let's count your overclocking (or over-consuming) + cooling system: 300 W. Same as your computer, 3 h a day: 900 Wh/day.
Want to add your 2 4K screens? 150 W each, 3 h a day: 900 Wh again.

So if you run the math you have :
  • washing machine : 750Wh/day
  • dishwasher : 750Wh/day
  • Electric oven : 1000Wh/day
  • Microwave : 60Wh/day
  • Fridge : 1000Wh/day
  • Total : 3560Wh/day
And your computer :
  • Rig : 1500Wh/day
  • Display : 900Wh/day
  • Base : 2400Wh/day
  • Adding over-consumption : 900Wh/day
  • Total : 3300Wh/day
Do you still think it's negligible? You clearly never looked at your bills and at what really consumes electricity.
The peak power means nothing compared to how long it runs. Your PC runs a lot longer than your freaking oven. Even if you have only a single 150 W display, it still consumes a lot.

I always cap my games at 60 FPS because I freaking don't need 300 FPS, even if I could have them.
Sorry for the long post, but many people don't know the basic consumption of their own equipment.
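Taking the estimates above at face value, the tally works out like this (a quick script, using the Wh/day figures as given):

```python
# The per-day estimates from the post above, taken as given (Wh/day).
appliances = {
    "washing machine": 750,
    "dishwasher": 750,
    "electric oven": 1000,
    "microwave": 60,
    "fridge + freezer": 1000,
}
pc = {
    "rig (500 W x 3 h)": 1500,
    "two displays (2 x 150 W x 3 h)": 900,
    "OC + cooling overhead (300 W x 3 h)": 900,
}

print(f"Appliances total: {sum(appliances.values()):,} Wh/day")  # 3,560 Wh/day
print(f"PC total:         {sum(pc.values()):,} Wh/day")          # 3,300 Wh/day
```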
 
Joined
Jul 5, 2013
Messages
27,483 (6.63/day)
You really don't get it, do you?
Irony. See below.

So if you run the math you have :
  • washing machine : 750Wh/day
  • dishwasher : 750Wh/day
  • Electric oven : 1000Wh/day
  • Microwave : 60Wh/day
  • Fridge : 1000Wh/day
  • Total : 3560Wh/day
And your computer :
  • Rig : 1500Wh/day
  • Display : 900Wh/day
  • Base : 2400Wh/day
  • Adding over-consumption : 900Wh/day
  • Total : 3300Wh/day
Not only are your numbers flawed (read: ridiculously out of whack) because you are using Wh/day instead of kWh/month (billing happens on a monthly cycle, not a daily cycle), but you don't add in the power of a PC and its components if it's your daily-use machine. You calculate the difference between the average GPU and the GPU in question; in the case of what you were arguing, a 3090. Additionally, not everyone has an electric oven. Most have gas. You also didn't include household lighting or a clothes dryer, which can be either fully electric or partly so.

If you are going to throw out numbers and present a theorem, make sure it's within the realms of reality instead of something you pulled from your backside.
 
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Irony. See below.


Not only are your numbers flawed (read: ridiculously out of whack) because you are using Wh/day instead of kWh/month (billing happens on a monthly cycle, not a daily cycle), but you don't add in the power of a PC and its components if it's your daily-use machine. You calculate the difference between the average GPU and the GPU in question; in the case of what you were arguing, a 3090. Additionally, not everyone has an electric oven. Most have gas. You also didn't include household lighting or a clothes dryer, which can be either fully electric or partly so.

If you are going to throw out numbers and present a theorem, make sure it's within the realms of reality instead of something you pulled from your backside.
Exactly.

@Renald You shouldn't use base numbers for an entire PC as your example. Instead, take the consumption of an overclocked PC versus a regular one on a monthly basis. Then the difference will be negligible. I see there is a part for it in your example. What I don't understand is: 1. how you got to the conclusion that overclocking will consume 300 watts, and 2. why you included everything else in your calculation. We're talking about PC parts with overly high power consumption and whether it's a bad thing or not. We're not talking about throwing your PC out of the window and living life as a hermit.

Edit: By the way, yes, I've looked at my bill. In fact, I look at it every month. I'm not sure where you're from, but in the UK, we have something called a "standing charge" which is a fixed daily amount that you have to pay regardless of your consumption. Believe it or not, the bulk of my bill comes from this and my gas charge, not my electricity usage. And believe it or not, when I go on a trip for a week or two, and don't even turn my PC or TV on, my bill stays nearly the same.

Edit 2: Let's use your example to calculate the costs. I'm currently paying 27p per kWh, which is average for the UK, but expensive compared to the world average. Your 300 W overclock (I still don't know how you got that number, but never mind) with your 3 h/day gaming sessions (which is 900 Wh/day) will increase your bill by £7.29 a month. Considering that nearly everybody pays over £100 a month for gas and electricity, and considering that a 300 W overclock is freaking massive, I'd say, yes, it's a pretty negligible increase.
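For reference, here's the same arithmetic as a small reusable helper (assuming the 27p/kWh tariff above; swap in your own rate):

```python
# Monthly cost of an extra load, using the tariff from the post above.
def monthly_cost(extra_watts: float, hours_per_day: float,
                 price_per_kwh: float = 0.27, days: int = 30) -> float:
    """Extra monthly cost (in the tariff's currency) of an added load."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

print(f"300 W OC, 3 h/day: GBP {monthly_cost(300, 3):.2f}")  # ~7.29
print(f"150 W OC, 3 h/day: GBP {monthly_cost(150, 3):.2f}")  # ~3.65
```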
 
Joined
Apr 1, 2013
Messages
225 (0.05/day)
Exactly.

@Renald You shouldn't use base numbers for an entire PC as your example. Instead, take the consumption of an overclocked PC versus a regular one on a monthly basis. Then the difference will be negligible.
Sure, let's do this.

30 days a month. So your full household electrical equipment consumption comes to roughly 107 kWh/month (3,560 Wh/day × 30).

Let's just consider that your overclocking consumes 300 W: GPU overclocked + cooled, CPU overclocked + cooled, by choice (manual OC) or by design (native overclocking by the manufacturer). This is relatively accurate, right? Let's consider... 15 h each week (less usage during the weekdays, more on the weekend). That's 2 h/day on average. So, basic math: 600 Wh/day, or 18 kWh per month.

So you're telling me that an overclock of 2 components, representing roughly a fifth of your whole consumption (washing machine, dishwasher, oven, microwave, fridge), is negligible?
Nearly 20% on top of your whole damn kitchen/bathroom heavy-appliance consumption is nothing? It's the consumption of a heavily used clothes dryer (for clothes, not hair) for a family of 4, just to overclock 2 components, and you think it's negligible?
It's not going to kill any bird or fish, even accumulated over thousands of people. But 20% to gain what? 5% more performance that you won't need? 10 s out of 200 s gained on an encode, for that much energy? It's really a waste considering our planet's current state. That's my point.


And even if you don't think it's relevant, I think it's good to mention that a running computer or console uses a lot more energy than people would think.
If you live in a hot country (like 30 °C at night / 40 °C during the day in summer) and don't have air conditioning, you'll find out very fast that your PC is a living heater. It's like putting the heater on during summer. Even with a modest rig and display, the whole system draws 300-400 W while in use, which is unbearable in summer.
In winter, if it's 0-5 °C outside, my rig is enough to warm up a 10 m² room, no heater needed.

Last point:
Beyond any numbers, and you, I and many others will agree or already do: you're right, it's basically nothing compared to other pollution sources. But referring to my previous calculation, imposing a 20% growth in energy consumption by design, like Intel currently does, is not the right way. It's not right to claim the crown for some highest score by natively OCing every CPU to the maximum, just for stupid marketing BS. AMD does it right (mostly) and lets you decide if you want to burn twice as much energy for 5 FPS on a CPU, whereas Intel is creating energy monsters just to beat AMD to the FPS crown in games.
This is futile, mainly regarding the environment, but also for your bill and your heat problems (depending on where you live). Not caring about any of these points is really selfish, but the world is made of everything. Fine. But the noise? It's just a joke, right?

Not only are your numbers flawed (read: ridiculously out of whack) because you are using Wh/day instead of kWh/month (billing happens on a monthly cycle, not a daily cycle)
I used basic manufacturer numbers: 1.5 kWh per cycle for a washing machine, given you're using it once every 2 days. I'm happy to read any "real numbers" you claim are not "out of whack" :)
I just simplified (and explained how); I showed that a 2 kW peak draw isn't what matters.

but you don't add in the power of a PC and its components if it's your daily-use machine. You calculate the difference between the average GPU and the GPU in question; in the case of what you were arguing, a 3090.
That's in my answer: I'm not arguing about how much is overclock and how much is base; my separation is clear. If your OC (manual or as designed) uses 200 W more than the same chip at stock, that's still 200 Wh per hour of running. Make it 100 W if you like; it doesn't change a thing about the result: a basically negligible 5% gain (sometimes nothing).

Additionally, not everyone has an electric oven. Most have gas. You also didn't include household lighting or a clothes dryer, which can be either fully electric or partly so.

If you are going to throw out numbers and present a theorem, make sure it's within the realms of reality instead of something you pulled from your backside.
I'm sorry to tell you that on the old continent (Europe), we mainly use electric ovens, not gas. Not everyone lives like US citizens.
Fair enough somehow; let's consider it 50/50 ;)


Put your math on; show the world that all my numbers are incorrect, but I don't think you'll find anything outside reality.
Some examples:

[Attachments: appliance power-use screenshots]



You can find equipment that consumes a hell of a lot less, but that would just prove my point even more: it's not normal to double the energy consumption of a CPU to gain 5% performance. It's a waste of energy.
You don't care. Live with it, I'm fine, but don't tell me my numbers are wrong. I deliberately took high numbers just to show that a PC costs more to run than most of the equipment you own and use.

Exactly.

@Renald You shouldn't use base numbers for an entire PC as your example. Instead, take the consumption of an overclocked PC versus a regular one on a monthly basis. Then the difference will be negligible. I see there is a part for it in your example. What I don't understand is: 1. how you got to the conclusion that overclocking will consume 300 watts, and 2. why you included everything else in your calculation. We're talking about PC parts with overly high power consumption and whether it's a bad thing or not. We're not talking about throwing your PC out of the window and living life as a hermit.

Edit: By the way, yes, I've looked at my bill. In fact, I look at it every month. I'm not sure where you're from, but in the UK, we have something called a "standing charge" which is a fixed daily amount that you have to pay regardless of your consumption. Believe it or not, the bulk of my bill comes from this and my gas charge, not my electricity usage. And believe it or not, when I go on a trip for a week or two, and don't even turn my PC or TV on, my bill stays nearly the same.

Edit 2: Let's use your example to calculate the costs. I'm currently paying 27p per kWh, which is average for the UK, but expensive compared to the world average. Your 300 W overclock (I still don't know how you got that number, but never mind) with your 3 h/day gaming sessions (which is 900 Wh/day) will increase your bill by £7.29 a month. Considering that nearly everybody pays over £100 a month for gas and electricity, and considering that a 300 W overclock is freaking massive, I'd say, yes, it's a pretty negligible increase.
On your edit: sorry to hear that. During my vacations or the summer, I mainly pay for my fridge, nothing more. Around €30/month in spring/summer/fall and twice that in the winter months.
This is my yearly bill for last year for a 70 m² flat (I'm fully electric; no gas, no fuel, no wood):

[Attachments: yearly electricity bill screenshots]

30% of this is the fee just to get electricity
10% is taxes
The rest is my "real bill". I don't have any discount or anything to decrease it.

So let's take what you pay into account in my bill :D
Let's say it's €8 per month, so €96 per year. So, as I said, a 20% increase (just consider 900 Wh/day; say you're an enthusiast OCer who uses their PC 9 hours a day, if that's more tangible).
Why 300 W, if you ask me? Because if you compare a top-notch Intel power monster to a high-end AMD CPU, there's already a 100 W difference IIRC. Also, a large share of GPUs are natively OCed for no reason, or running (that's the owner's choice) at max FPS when the screen can only display 60 Hz.
On top of that, you have to cool it, which requires 2 pumps. That's not much, but it all adds up (and you also lose efficiency on your PSU when using it near max capacity), and you'll probably have a fan (a real one) or an air conditioning system.
300 W may be exaggerated; knock 100 W off if you want. It's still around 15% of my bill.

I think it mainly depends on how much you already pay. What's true in many continental European countries isn't in the UK/Ireland/USA.


Edit: Sorry for my writing, it's really late :D
 
Joined
Jul 5, 2013
Messages
27,483 (6.63/day)
Not everyone lives like US citizens.
And not everyone is living in the EU. Most of the world has access to natural gas, and most of the world uses gas for heating and cooking.
show the world that all my numbers are incorrect, but I don't think you'll find anything outside reality
I'll take a stab at it, and I won't need anything other than what YOU provided...
I used basic manufacturer numbers: 1.5 kWh per cycle for a washing machine
You said this, then you showed...

[Attachment: washing machine power-use graph]

...this.

but don't tell me my numbers are wrong.
What was that?


This is the second example of direct dishonesty on your part. I'm not wasting any more time on your silly self. Stop with the disinformation, stop with the dishonesty, and stop with the virtue signaling.
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I always cap my games at 60 FPS because I freaking don't need 300 FPS, even if I could have them.
Sorry for the long post, but many people don't know the basic consumption of their own equipment.

How wasteful! I play games at 720p 30 FPS to save power. How dare you play games at 60 FPS :roll:
 
Joined
Jan 14, 2019
Messages
12,334 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sure, let's do this.

30 days a month. So your full household electrical equipment consumption comes to roughly 107 kWh/month (3,560 Wh/day × 30).

Let's just consider that your overclocking consumes 300 W: GPU overclocked + cooled, CPU overclocked + cooled, by choice (manual OC) or by design (native overclocking by the manufacturer). This is relatively accurate, right? Let's consider... 15 h each week (less usage during the weekdays, more on the weekend). That's 2 h/day on average. So, basic math: 600 Wh/day, or 18 kWh per month.

So you're telling me that an overclock of 2 components, representing roughly a fifth of your whole consumption (washing machine, dishwasher, oven, microwave, fridge), is negligible?
Nearly 20% on top of your whole damn kitchen/bathroom heavy-appliance consumption is nothing? It's the consumption of a heavily used clothes dryer (for clothes, not hair) for a family of 4, just to overclock 2 components, and you think it's negligible?
It's not going to kill any bird or fish, even accumulated over thousands of people. But 20% to gain what? 5% more performance that you won't need? 10 s out of 200 s gained on an encode, for that much energy? It's really a waste considering our planet's current state. That's my point.


And even if you don't think it's relevant, I think it's good to mention that a running computer or console uses a lot more energy than people would think.
If you live in a hot country (like 30 °C at night / 40 °C during the day in summer) and don't have air conditioning, you'll find out very fast that your PC is a living heater. It's like putting the heater on during summer. Even with a modest rig and display, the whole system draws 300-400 W while in use, which is unbearable in summer.
In winter, if it's 0-5 °C outside, my rig is enough to warm up a 10 m² room, no heater needed.

Last point:
Beyond any numbers, and you, I and many others will agree or already do: you're right, it's basically nothing compared to other pollution sources. But referring to my previous calculation, imposing a 20% growth in energy consumption by design, like Intel currently does, is not the right way. It's not right to claim the crown for some highest score by natively OCing every CPU to the maximum, just for stupid marketing BS. AMD does it right (mostly) and lets you decide if you want to burn twice as much energy for 5 FPS on a CPU, whereas Intel is creating energy monsters just to beat AMD to the FPS crown in games.
This is futile, mainly regarding the environment, but also for your bill and your heat problems (depending on where you live). Not caring about any of these points is really selfish, but the world is made of everything. Fine. But the noise? It's just a joke, right?


I used basic manufacturer numbers: 1.5 kWh per cycle for a washing machine, given you're using it once every 2 days. I'm happy to read any "real numbers" you claim are not "out of whack" :)
I just simplified (and explained how); I showed that a 2 kW peak draw isn't what matters.


That's in my answer: I'm not arguing about how much is overclock and how much is base; my separation is clear. If your OC (manual or as designed) uses 200 W more than the same chip at stock, that's still 200 Wh per hour of running. Make it 100 W if you like; it doesn't change a thing about the result: a basically negligible 5% gain (sometimes nothing).


I'm sorry to tell you that on the old continent (Europe), we mainly use electric ovens, not gas. Not everyone lives like US citizens.
Fair enough somehow; let's consider it 50/50 ;)


Put your math on; show the world that all my numbers are incorrect, but I don't think you'll find anything outside reality.
Some examples:

[Attachments: appliance power-use screenshots]


You can find equipment that consumes a hell of a lot less, but that would just prove my point even more: it's not normal to double the energy consumption of a CPU to gain 5% performance. It's a waste of energy.
You don't care. Live with it, I'm fine, but don't tell me my numbers are wrong. I deliberately took high numbers just to show that a PC costs more to run than most of the equipment you own and use.


On your edit: sorry to hear that. During my vacations or the summer, I mainly pay for my fridge, nothing more. Around €30/month in spring/summer/fall and twice that in the winter months.
This is my yearly bill for last year for a 70 m² flat (I'm fully electric; no gas, no fuel, no wood):

[Attachments: yearly electricity bill screenshots]
30% of this is the fee just to get electricity
10% is taxes
The rest is my "real bill". I don't have any discount or anything to decrease it.

So let's take what you pay into account in my bill :D
Let's say it's €8 per month, so €96 per year. So, as I said, a 20% increase (just consider 900 Wh/day; say you're an enthusiast OCer who uses their PC 9 hours a day, if that's more tangible).
Why 300 W, if you ask me? Because if you compare a top-notch Intel power monster to a high-end AMD CPU, there's already a 100 W difference IIRC. Also, a large share of GPUs are natively OCed for no reason, or running (that's the owner's choice) at max FPS when the screen can only display 60 Hz.
On top of that, you have to cool it, which requires 2 pumps. That's not much, but it all adds up (and you also lose efficiency on your PSU when using it near max capacity), and you'll probably have a fan (a real one) or an air conditioning system.
300 W may be exaggerated; knock 100 W off if you want. It's still around 15% of my bill.

I think it mainly depends on how much you already pay. What's true in many continental European countries isn't in the UK/Ireland/USA.


Edit: Sorry for my writing, it's really late :D
OK, let me make my point clear before we make assumptions. :)

I'm not saying that a 5% performance increase that costs 20% more energy isn't wasteful. It definitely is when you consider heat, noise and other factors. This is why I personally never overclock - if you don't count raising power limits, that is.

What I'm saying is that you need to look at the bigger picture if you're thinking about the planet. Logistics companies with gas-guzzling diesel lorries, manufacturing industries, coal power plants (some parts of the world still have a lot of them), huge server/crypto farms: I think these affect the environment infinitely more than your PC at home.

As for power bills, like I said, most UK households pay way over £100 a month just for their basic electricity and gas needs. I know families who pay just shy of £200 because they have kids and live in a detached house, with no heating from neighbours on the sides like I have in my apartment. A pretty significant 300 W overclock that raises your bill by £7.29 a month (or a more realistic 150 W overclock worth about £3.65) is totally insignificant.

As for wasteful components by design, I would agree if you hadn't used (Intel) CPUs as your example. It's easy to hate on a chip's maximum power consumption, but no one said you're running it at 100% all the time. My 11700, which throttles in Prime95 due to my motherboard's 200 W power limit, barely uses its factory 65 W limit in most games.

Let's also mention that idle efficiency is great across the board, and that's where most PCs sit most of the time.
 