# Why don't people use processing to generate our heat?



## TheoneandonlyMrK (Dec 10, 2016)

I have thought about this for a long time, and many people doing folding and crunching DO use it to heat their rooms or homes, as I do. So I thought I would ask your opinions, since to me it's a very efficient use of energy to create heat, much better than just pushing loads of current through a thin, tough wire anyway.
My admittedly expensive overclocked PC could quite easily heat a room (definitely a flat, maybe a house) and does. I have to adjust the processing load to regulate my room's temperature, well, that and windows (real ones), but all in all I get heat. It still costs, but some good gets done too, so who loses?
Now, to my mind, if this were a designed system then a house could be heated easily.


----------



## Solaris17 (Dec 10, 2016)

There are projects in Finland etc., IIRC, where DCs are piping hot water to local neighborhoods and testing heating their homes, in exchange for bolting radiators to the side of the house.


----------



## CAPSLOCKSTUCK (Dec 10, 2016)

When my daughter wanted a heater in her room I installed a 2P server; after 2 weeks I took it out again... the noise was stopping her sleeping. It was a very effective heater in her room, though.


----------



## TheoneandonlyMrK (Dec 10, 2016)

CAPSLOCKSTUCK said:


> When my daughter wanted a heater in her room I installed a 2P server; after 2 weeks I took it out again... the noise was stopping her sleeping. It was a very effective heater in her room, though.


I get that fully, but you were improvising, to be fair. Noise is the reason I spent so much on this PC; the more radiator you attach, in any way, the quieter you can run your fans.
Think about how effective something made for the task could be.
Since I use earbuds to sleep, I should probably shut up or spend more myself.


----------



## FordGT90Concept (Dec 10, 2016)

Because a space heater uses something like 1200+ watts, where your average computer is doing well to hit 300 W without straining a large GPU. A heater maybe costs $50; a 1200+ watt computer would cost thousands.

Space heaters respond to ambient temperature; computers generally do not.
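For a rough sense of the wattage gap, here's a sketch with my own illustrative numbers (a ~40 m³ room, heating the air only, no wall losses; none of these figures come from the thread) of how long each device takes to lift the air temperature by 5 °C:

```python
# Idealized warm-up time: 1200 W space heater vs. 300 W PC.
# Assumes a sealed 40 m^3 room of air and no losses through walls,
# windows, or furniture, so both times are optimistic.

AIR_DENSITY = 1.2         # kg/m^3 at room temperature
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

def minutes_to_warm(power_w, room_m3=40.0, delta_t=5.0):
    """Minutes for power_w watts to raise room air by delta_t degrees C."""
    air_mass = room_m3 * AIR_DENSITY                        # kg of air
    energy_needed = air_mass * AIR_SPECIFIC_HEAT * delta_t  # joules
    return energy_needed / power_w / 60.0                   # minutes

print(f"1200 W heater: {minutes_to_warm(1200):.1f} min")
print(f" 300 W PC:     {minutes_to_warm(300):.1f} min")
```

The absolute times are toy numbers, but the 4:1 ratio is the point: at a quarter of the wattage, the PC takes four times as long to deliver the same heat.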


----------



## alucasa (Dec 10, 2016)

Cost and efficiency is why.


----------



## Smanci (Dec 10, 2016)

Most modern PCs produce so little heat that they're hardly usable as space heaters. An i7 6700/GTX 1070 rig will barely warm a small room; with AMD it gets a little hotter, though.
Data centers already warm our houses enough.


----------



## TheoneandonlyMrK (Dec 10, 2016)

alucasa said:


> Cost and efficiency is why.


Cost is the why; there is a double efficiency to compute heating. A 1200-watt computer will cost a bit, I know, but in load terms about 600-700 watts continuously in use is adequate in my experience.
@FordGT90Concept: yeah, I get what you are saying, but like I said, a machine made for such a thing, possibly with chips made to run hotter, and obviously something for the PC to do, like folding or crunching.
@Smanci: most modern PCs spend most of their lives doing nothing, so yeah, well spotted, but think outside the box: get the box folding or something else and run it at clocks that produce the most heat, not the best efficiency, or make a special system.


----------



## CAPSLOCKSTUCK (Dec 10, 2016)

The 2P I put in my daughter's room was crunching at 100% using two Xeon X5650s. Not overclocked, on a Supermicro board, it was pulling something like 300 W at the wall with no add-on GPU.


----------



## blobster21 (Dec 10, 2016)

In France we have a Parisian startup called Qarnot Computing which developed the Q.Rad, a "smart heater / high-end computer" connected to the internet.

The heat production and the electricity bill are free; all you have to do is let this high-end computer do cloud-computing tasks for third-party customers (banks, rendering farms, etc.).

*Tech specs:*

- TDP: 500 W ("high-end soft heat")
- Computing power: 600 GFLOPS ("computing heat source")
- Noise: 0 dB (totally silent)
- Carbon footprint: 0 g
- Heating bill: $0
- Dimensions: 65 x 60 x 16 cm (H x W x D)
- Heating power: 500 W
- Weight: 27 kg
- Power: 110/230 V AC / 500 W
- Network: RJ45 Ethernet
- Computing units: 3 high-end Intel units embedded, 8 vcores @ 4 GHz / 16 GB RAM per unit

For now, installation of these heaters is only available in buildings equipped with fiber; the network bandwidth requirement must be pretty high. Last year, Qarnot Computing signed with the Ville de Paris and deployed 350 heaters throughout 100 social housing units. The Qarnot computing grid is harnessing the power of 1,000 CPUs and is still growing. (source)


----------



## Smanci (Dec 10, 2016)

theoneandonlymrk said:


> Most modern PCs spend most of their lives doing nothing, so yeah, well spotted, but think outside the box: get the box folding or something else and run it at clocks that produce the most heat, not the best efficiency, or make a special system.



I don't want to kill my hardware prematurely or suffer the noise, as it still wouldn't be much of a heater :I
I have been thinking about this idea too, but it just has too many drawbacks to be a good one.


----------



## thebluebumblebee (Dec 10, 2016)

My two GTX 980s, two 2600Ks, and an i3-3220T provide most of the heat for my family room.


Smanci said:


> I don't want to kill my hardware prematurely


DC does not overtax hardware. Maybe some of those really hot 7970s.


----------



## Jetster (Dec 10, 2016)

If you're gaming, you might as well, or at least turn your heat down when you game.


----------



## 64K (Dec 10, 2016)

Folding and crunching are good causes, but my home is heated by gas, which is cheaper than using electricity to heat it.


----------



## R-T-B (Dec 10, 2016)

It has been thought of: mining bitcoin as subsidized heating.

The general excuse I heard from startups in this field as to why it was impractical is that the Chinese ASIC ICs, when heated, tended to release toxins, and if used in a large-scale heating environment would eventually make you sick or kill you.

No idea if this applies to less strenuous computing activities, though.


----------



## TheoneandonlyMrK (Dec 10, 2016)

R-T-B said:


> It has been thought of: mining bitcoin as subsidized heating.
> 
> The general excuse I heard from startups in this field as to why it was impractical is that the Chinese ASIC ICs, when heated, tended to release toxins, and if used in a large-scale heating environment would eventually make you sick or kill you.
> 
> No idea if this applies to less strenuous computing activities, though.


From my thinking, ASICs would be too efficient; ideally you want as many transistors as possible in as small an area as possible, i.e. GPUs or modern high-core-count CPUs, in order to have a small, manageable area of intense heat production.
I think it's a bit expensive on the smaller scale and less feasibly efficient on a large scale, but sound, bar costs.
As for electrical equipment releasing toxins while in use, COSHH and CE safety marks, as well as FCC standards, should negate such issues, no?


----------



## Ungari (Dec 10, 2016)

Is this another elaborate  setup for an AMD thermal punchline?


----------



## TheoneandonlyMrK (Dec 10, 2016)

Ungari said:


> Is this another elaborate  setup for an AMD thermal punchline?


With the system I've got listed?? I've built it up over years, so no. And all modern 8+ core CPUs and modern GPUs run hot; it's the law.


----------



## R-T-B (Dec 10, 2016)

theoneandonlymrk said:


> From my thinking, ASICs would be too efficient; ideally you want as many transistors as possible in as small an area as possible, i.e. GPUs or modern high-core-count CPUs, in order to have a small, manageable area of intense heat production.



Bitcoin ASICs are a highly competitive field and are just as tightly transistor-packed as most GPUs and many CPUs. I think they have them down to 22 nm now... they were at 28 nm when I left.



theoneandonlymrk said:


> As for electrical equipment releasing toxins while in use, COSHH and CE safety marks, as well as FCC standards, should negate such issues, no?



Not if you're running it from a high-wattage, high-usage compute standpoint, no. The FCC won't even certify compute systems that won't run on a 15 A, 120 V line. I know this because the bitcoin firm Spondoolies-Tech ran into exactly this.


----------



## TheoneandonlyMrK (Dec 10, 2016)

R-T-B said:


> Bitcoin ASICs are a highly competitive field and are just as tightly transistor-packed as most GPUs and many CPUs. I think they have them down to 22 nm now... they were at 28 nm when I left.
> 
> 
> 
> Not if you're running it from a high-wattage, high-usage compute standpoint, no. The FCC won't even certify compute systems that won't run on a 15 A, 120 V line. I know this because the bitcoin firm Spondoolies-Tech ran into exactly this.


I'm sure European law would not allow toxic emissions from electronic devices; I've worked to move devices into COSHH compliance. I thought FCC standards might be able to cope with that issue?? Outside my knowledge, that I'll admit.
Plus I think it could be a matter of tuning the device and design to suit the application.

A few are saying it can't be done, so I'll give you an example.

I've been folding for years and will do some more, so I have the load; that's beyond question.

My old 2-bed apartment was heated to a reasonable 21-degree ambient temperature in bleak winters.

It used 1000-ish watts max, but close to 800 most of the time, and cost 50 quid on electric. Not too much with the gas taken off, but more, yes.

My PC now has twice the rad and hotter GPUs but uses 680 watts typically flat out with a mild OC, and wow, it can whack out some heat.

In my present home the heater dial is a battleground and my room is a freezer, so I adapted and overcame, and my room is never cold.
Also, I do have a dual-GPU rig that's not bad for gaming, and I was folding anyway.
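As a back-of-envelope check on a rig averaging ~800 W around the clock (the tariff is my assumption, roughly a 2016-era UK rate; it is not a figure from the thread):

```python
# Monthly energy and cost of a rig averaging ~800 W, 24/7.
# Tariff of 10p/kWh is an assumed, illustrative UK-ish rate.

def monthly_kwh(avg_watts, hours=24 * 30):
    """kWh consumed over a 30-day month at a constant average draw."""
    return avg_watts / 1000 * hours

def monthly_cost_gbp(avg_watts, pence_per_kwh=10.0):
    """Monthly electricity cost in pounds at the assumed tariff."""
    return monthly_kwh(avg_watts) * pence_per_kwh / 100

print(f"{monthly_kwh(800):.0f} kWh/month")           # 576 kWh
print(f"~£{monthly_cost_gbp(800):.0f}/month at 10p/kWh")  # ~£58
```

At that assumed rate, ~800 W continuous lands in the same ballpark as the "50 quid" figure, though the real number depends entirely on the actual tariff and duty cycle.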


----------



## AsRock (Dec 10, 2016)

alucasa said:


> Cost and efficiency is why.




This ^^.

Also, the UK ain't really that cold, and no worthwhile PC is going to keep our house warm when it's some 0-25 °C.

It'd be much more efficient to just wrap up.


----------



## TheoneandonlyMrK (Dec 11, 2016)

AsRock said:


> This ^^.
> 
> Also, the UK ain't really that cold, and no worthwhile PC is going to keep our house warm when it's some 0-25 °C.
> 
> It'd be much more efficient to just wrap up.


You don't live in my house 


blobster21 said:


> In France we have a Parisian startup called Qarnot Computing which developed the Q.Rad, a "smart heater / high-end computer" connected to the internet.
> 
> The heat production and the electricity bill are free; all you have to do is let this high-end computer do cloud-computing tasks for third-party customers (banks, rendering farms, etc.).
> 
> ...


This is more my thinking.


----------



## R-T-B (Dec 11, 2016)

theoneandonlymrk said:


> I'm sure European law would not allow toxic emissions from electronic devices; I've worked to move devices into COSHH compliance. I thought FCC standards might be able to cope with that issue?? Outside my knowledge, that I'll admit.



I think it's more an issue that the Chinese bitcoin ASICs are shit-quality, lead-laden monstrosities that don't bother with any certs.

Like I said, probably doesn't apply here.


----------



## FreedomEclipse (Dec 11, 2016)

Back in the day, I used to have two 6970s in CrossFire that ran up to 80-85 °C when gaming.....


----------



## AsRock (Dec 11, 2016)

theoneandonlymrk said:


> You don't live in my house
> 
> This is more my thinking.



Then maybe you should change the title? As I understand it, you mean more than just you and your house.


----------



## hat (Dec 11, 2016)

Speaking strictly from the heat-generation standpoint, aren't computers very inefficient? Correct me if I'm wrong, but isn't heat the result of wasted electricity? For example, a 90% efficient power supply draws 100 W from the wall: 90 W goes to the components, and the other 10 W is lost as heat due to inefficiencies. I imagine components work in much the same way. As the current flows, most of it is actually used to power the components... some of it is lost as heat, because we don't have 100% efficient systems.

Therefore, using computers to generate heat is a bad idea. It's inefficient. Using a heater designed specifically to throw off heat is more efficient than loading a computer to throw off heat. Heaters are cheap compared to computers, and the energy they use is actually being used to heat the room. That said, if you have a computer generating a lot of heat anyway, finding a way to use that heat output wouldn't be such a horrible idea. I used to have heaters like this in one apartment I lived in: basically a big radiator that ran along the length of the wall, with electrically heated fins that dumped heat into the room. Perhaps it would be possible to use some heatpipes to connect something like that to your heatsink base, using the heater as a heatsink. I imagine such a project would be costly and inconvenient, however, even if it did work.


----------



## Static~Charge (Dec 11, 2016)

The OP's idea has more merit in a business environment. If I could capture the waste heat from my server room and blow it around the office, that would put a noticeable dent in our heating bill every winter. Efficiency isn't an issue here because those servers are running 24/7/365, and their heat is a useful by-product (during the cold months, anyway). As it stands now, we're using air conditioners to dump that heat outside.


----------



## therealmeep (Jan 26, 2017)

hat said:


> Speaking strictly from the heat-generation standpoint, aren't computers very inefficient? Correct me if I'm wrong, but isn't heat the result of wasted electricity? For example, a 90% efficient power supply draws 100 W from the wall: 90 W goes to the components, and the other 10 W is lost as heat due to inefficiencies. I imagine components work in much the same way. As the current flows, most of it is actually used to power the components... some of it is lost as heat, because we don't have 100% efficient systems.
> 
> Therefore, using computers to generate heat is a bad idea. It's inefficient. Using a heater designed specifically to throw off heat is more efficient than loading a computer to throw off heat. Heaters are cheap compared to computers, and the energy they use is actually being used to heat the room. That said, if you have a computer generating a lot of heat anyway, finding a way to use that heat output wouldn't be such a horrible idea. I used to have heaters like this in one apartment I lived in: basically a big radiator that ran along the length of the wall, with electrically heated fins that dumped heat into the room. Perhaps it would be possible to use some heatpipes to connect something like that to your heatsink base, using the heater as a heatsink. I imagine such a project would be costly and inconvenient, however, even if it did work.


The other 90 W is also lost as heat in your system, as computers are essentially 100% efficient at turning electricity into heat (a small percentage technically goes to spinning the fans, and even that ends up as heat). Computers just don't draw enough power to heat a room as fast as a higher-wattage heater.
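A minimal energy-balance sketch of this point, reusing hat's 100 W wall draw and 90%-efficient PSU from above: the PSU loss and the downstream component dissipation add back up to the full wall draw.

```python
# Energy balance for a PC as a heater: conservation of energy means
# every watt drawn at the wall ends up as heat in the room.

wall_draw = 100.0        # W drawn from the outlet (hat's example)
psu_efficiency = 0.90    # 90% efficient power supply

psu_loss = wall_draw * (1 - psu_efficiency)  # 10 W dissipated inside the PSU
to_components = wall_draw * psu_efficiency   # 90 W delivered downstream

# The 90 W isn't "consumed" doing logic: resistive and switching losses
# in the CPU, GPU, VRMs, and RAM turn virtually all of it into heat too,
# and even fan work degrades to heat via air friction.
component_heat = to_components

total_heat = psu_loss + component_heat
assert total_heat == wall_draw  # 100 W in, 100 W of heat out
```

So per kWh, a PC heats exactly like a resistive heater; the differences are capital cost and peak output, not heating efficiency.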


----------



## Toothless (Jan 26, 2017)

therealmeep said:


> The other 90 W is also lost as heat in your system, as computers are essentially 100% efficient at turning electricity into heat (a small percentage technically goes to spinning the fans, and even that ends up as heat). Computers just don't draw enough power to heat a room as fast as a higher-wattage heater.


My two GTX 780s heat my room up even if I leave my window open and it's 0 °C outside. It's like a nice, comfortable temp in my room.


----------



## FreedomEclipse (Jan 26, 2017)

Toothless said:


> My two GTX 780s heat my room up even if I leave my window open and it's 0 °C outside. It's like a nice, comfortable temp in my room.



I remember when my 6970s used to do that. Both cards ran at just under 90 °C (stock reference cooler) when playing BF3.


----------



## jboydgolfer (Jan 26, 2017)

For many years I've considered things like this, in many different capacities. For instance, having a battery cell in the trunk of a vehicle which collects the energy given off by the spinning tires, and in particular the friction when the brakes are applied. There are so many sources of wasted energy similar to that one. I remember in my old home watching my oil furnace burner exhaust and the dryer's exhaust blowing hot air out into the environment, and thinking I wished it could have been vented under my driveway to keep ice from forming.

But at the end of the day, at least here in the US, companies don't want customers, they want CONSUMErs; in other words, people who buy an item, consume it, and dispose of it, therefore requiring a new item to replace the first.

Which is bad for the environment, bad for the consumer, and, although profitable for the company in the short term, bad for the company too.


----------



## TheoneandonlyMrK (Jan 26, 2017)

therealmeep said:


> The other 90w of power is also lost as heat in your system as computers are 100% efficient at turning electricity into heat (a small percentage of that technically goes to spinning the fans). Computers don't draw enough power to heat as efficiently as a higher wattage heater.


In my present case my PC is 90% efficient at 24/7 simulations and 100% just right to heat my room.
A heater would be more efficient at heating, but it's shit at Crysis, let alone running sims.
You shouldn't look at it solely on heat output; with the right use case it's doubly efficient.


----------



## Boatvan (Jan 26, 2017)

I agree that an enterprise environment would be better suited for this theory. Judging by how much heat the equipment puts out in both our main closet and our server room, it seems backwards that instead of utilizing the waste heat, we blow A/C in there 24/7. In theory this would make sense, but with most technology budgets it is easier said than done. At the datacenter level this is already done in some places: http://www.greenbuildingadvisor.com/blogs/dept/building-science/using-server-farms-heat-buildings


----------



## Beertintedgoggles (Jan 26, 2017)

http://www.datacenterknowledge.com/data-centers-that-recycle-waste-heat/


----------



## FordGT90Concept (Jan 26, 2017)

Datacenters should be built adjacent to office buildings.  In the winter, they could use that electronic heat to offset heating costs.


----------

