"we already have had electricity debates together ... a hotter cpu means a hotter house which means more AC usage etc."

And now, as then, I did not disagree with you on that. A 100W light bulb burns more energy and produces more heat than a 60W bulb. No duh!
But (1) unlike a light bulb, a CPU does NOT consume maximum power all the time, or even most of the time it is powered on - a point you keep forgetting or choose to keep ignoring. In fact, for the vast majority of users, over most 24-hour periods, most CPUs consume only a small percentage of their total capacity. More facts you keep forgetting or choose to ignore.
(2) Like those CPUs, the difference between that 100W bulb and that 60W bulb does NOT equate to 40W of "wasted" energy seen in the form of heat. It is significantly less than 40W.
(3) In fact, when doing the same tasks (the same amount of "work"), the hungrier device can take it easy, consuming a smaller percentage of its capacity, which can also mean slower cooling fan speeds - and less power consumption.
(4) In cooler months it works the other way around. That heat can help reduce facility heating costs by keeping your toes toasty.
And (5), if you did the math based on the cost of a kWh (averaging $0.13 in the US), you would see it would take a very long time, years even, to make up the difference in the purchase price of the new CPU. This is exactly why paying extra for a Titanium PSU over a Gold is just not worth it.
Yes, with a more efficient CPU, you consume less power and generate less heat. No disputing that. But it takes a long time and many kWh to burn up $100, as an example, in electricity. A 100W light bulb burning at full power for 10 solid hours would consume just 1 kWh, or $0.13 in cost. At that rate - 1 kWh every single day - it would take over 2 years to use up $100. I don't care how many videos you are rendering, your CPU is not working at 100% utilization, 10 hours per day, 365 days per year.
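For anyone who wants to sanity-check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. The $0.13/kWh rate and the 100W-for-10-hours bulb example come straight from the post above; the 40W load difference, 4 hours/day of heavy use, and $100 price premium for a newer CPU are made-up illustration numbers, not figures anyone in this thread actually quoted.

```python
# Back-of-the-envelope electricity cost sketch.
# Known from the post: ~$0.13/kWh average US rate, a 100 W bulb run 10 h/day, a $100 budget.
# Hypothetical example numbers (mine, not from the thread): a 40 W average power
# difference between two CPUs under load, 4 h/day of heavy use, and a $100 price premium.

RATE_PER_KWH = 0.13          # USD per kWh (US average cited in the post)

# --- The light bulb example from the post ---
bulb_kwh_per_day = 100 / 1000 * 10              # 100 W for 10 h = 1 kWh per day
bulb_cost_per_day = bulb_kwh_per_day * RATE_PER_KWH
days_to_100_dollars = 100 / bulb_cost_per_day
print(f"Bulb: ${bulb_cost_per_day:.2f}/day, {days_to_100_dollars:.0f} days "
      f"(~{days_to_100_dollars / 365:.1f} years) to burn through $100")

# --- Hypothetical CPU upgrade payback ---
watt_delta = 40              # assumed average power difference under load (W)
heavy_hours_per_day = 4      # assumed hours per day of heavy load
price_premium = 100          # assumed extra purchase cost of the newer CPU ($)

savings_per_year = watt_delta / 1000 * heavy_hours_per_day * 365 * RATE_PER_KWH
print(f"CPU: saves ${savings_per_year:.2f}/year, "
      f"payback in ~{price_premium / savings_per_year:.1f} years")
```

With those assumed numbers the bulb works out to the same "over 2 years to use up $100" figure, and the hypothetical CPU upgrade would take more than a decade of electricity savings to pay for itself - which is the point of (5).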
And for the record, the OP said the computer is used "mostly" for gaming and web browsing. Only "occasional" video "editing". And of course, video editing is greatly assisted by lots of RAM, and depending on the type of rendering done, many of those rendering tasks are handled by the GPU, not the CPU. And of course, in many computers, the graphics solution is the biggest power eater, not the CPU.
LOL This is where your argument turns even more silly. (1) You are "convinced your sample size of one experience renders moot the whole point". And (2) you seem to think rendering a video takes all day long and that's what the average user does every day.

"you think a 3930k will render a video using the same amount of heat as a new ryzen chip? (dont answer the question its rhetorical)"
Rhetorical or not, I never said anything of the sort. So here you are, implying I did say that when in reality, it is just you making stuff up to stroke your own ego! That's pretty sad. And for the record, not once has the OP said anything about considering a 3930K. So even your silly sample size of one is irrelevant here.
I said what I think. Others here have said what they think. None of us need you to fabricate falsehoods about things we didn't say or think.
***********
So here we are, full circle again, with me standing by what I actually said - which, by the way, you actually agree with! So you arguing here is just trollish nonsense! I say again, with emphasis on the pertinent parts so you can understand,
Bill_Bright said: I would not worry much about electricity usage unless you run your CPU with very high loads nearly 24/7.