Thursday, September 14th 2017

Space Heater Concept Reinvented: Qarnot's House Warming Computing Ft. Intel, AMD

Update: Qarnot has updated their page with AMD Ryzen 7 support for its 3 computing units, so it's not limited to Intel offerings. You can see the before and after screenshots at the bottom of this article.

In a move that is sure to bring the cozy, homely warm feeling back to the space heater concept of yore (who doesn't remember AMD's mocking videos of NVIDIA's Fermi architecture?), French company Qarnot has announced the third-generation iteration of a product that is sure to change a few degrees, Kelvin and Celsius, in the computing space. The company has decided not to let the heat generated by computing hardware during workload execution go to waste, and has instead chosen to capitalize on those "wasted" byproduct degrees as a means of reducing both the company's and its users' heating bills. Their Q.rad concept takes what is usually seen as a drawback in hardware (the amount of waste heat it generates) and turns it into a net positive, by making sure that the heat generated is put to good use in raising the otherwise chilly temperatures you might be facing.
Their Q.rad unit makes use of three cloud-enabled processors running at 4 GHz (originally listed as Intel Core i7s; Qarnot has since added AMD Ryzen 7 support, as noted in the update above) as a way of building computing blocks that double as radiators. As Qarnot crunches data (typically 3D rendering and VFX for film studios), its Q.rad provides up to 500 W of "soft heating power" to your home. Reusing heat in such a manner reduces Qarnot's carbon footprint and server space overhead while providing free heating for homes and offices, so both parties benefit. This is basically a company thinking outside the box about an old problem and finding an elegant, Columbus-egg-type solution: one that seems obvious in hindsight, but hadn't been thought of yet.
Basically, the company is asking you to lend them the space you would be using for a heater anyway, by allowing them to set up a Q.rad in your home. This means the company needs less space for its server infrastructure, cuts down cooling costs for its cloud computing hardware, and offers you 500 W of free heating power. The electricity cost of the Q.rad's operation is tracked by an integrated meter, which calculates energy expenses so they can be refunded to the host. This is the kind of out-of-the-box thinking that allows companies to grow into previously unheard-of spaces and opens new doors for distributed computing - or does it?
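As a back-of-the-envelope sketch of how such a refund might be computed (only the 500 W figure comes from Qarnot; the electricity rate and runtime below are assumptions for illustration):

# Rough sketch of the host-refund math described above; the rate
# and hours are illustrative assumptions, not Qarnot's numbers.
POWER_KW = 0.5           # Q.rad's stated maximum heating power, in kW
RATE_EUR_PER_KWH = 0.15  # assumed household electricity rate

def monthly_refund(hours_active: float) -> float:
    """Energy the unit consumed, billed back to the host."""
    return POWER_KW * hours_active * RATE_EUR_PER_KWH

# e.g. a unit computing 12 h/day over a 30-day winter month:
print(f"{monthly_refund(12 * 30):.2f} EUR")  # -> 27.00 EUR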
Source: Qarnot's Q.rad

34 Comments on Space Heater Concept Reinvented: Qarnot's House Warming Computing Ft. Intel, AMD

#1
RejZoR
I've been heating my place with a computer for years now. During the winter, I have to add heating only on the coldest days. The rest of the time I can leave heating entirely turned off in the room with the computer. Not exactly "innovation" then...
#2
Raevenlord
News Editor
RejZoRI've been heating my place with a computer for years now. During the winter, I have to add heating only on the coldest days. The rest of the time I can leave heating entirely turned off in the room with the computer. Not exactly "innovation" then...
Not on the concept, no, but the marketability of it is an innovation. They're basically putting their server racks in your home, trading all of those space and cooling expenses for a heating solution. That is where, for me, the genius lies. Turning the concept of "heat in hardware is bad" into "heat in hardware is good" and "I can't wait to have some company's hardware in my home, heating it up through waste heat".
#4
RejZoR
RaevenlordNot on the concept, no, but the marketability of it is an innovation. They're basically putting their server racks in your home, trading all of those space and cooling expenses for a heating solution. That is where, for me, the genius lies. Turning the concept of "heat in hardware is bad" into "heat in hardware is good" and "I can't wait to have some company's hardware in my home, heating it up through waste heat".
Perpetually cold climates would love this. Iceland, Siberia, Greenland, Scandinavia, Alaska and the northern parts of Canada...
#5
Raevenlord
News Editor
BafflesOk, then which of these articles is real?

www.eteknix.com/french-company-using-ryzen-pro-to-heat-homes/

TPU on team blue and Eteknix on team red?
Not to bash ETeknix, but Qarnot's page and the bottom picture in this article list a trio of i7 processors in the specifications, so, yeah, I'd say ours is pretty much "realer", unless Qarnot themselves messed up.
#6
FordGT90Concept
"I go fast!1!11!1!"
I'm disappointed. I was expecting a 4+ kW central heating core. You know, literally replacing your furnace. When the thermostat tells your "furnace" to turn on, it starts computing like mad and stops when the thermostat tells it to. The problem is the cost. Computer cores cost a crapload more than just metal filaments.

If such a thing existed, I'd be interested as long as the price was reasonable. Even if all it did was WCG, it'd be better than nothing.
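A minimal sketch of the thermostat-gated duty cycle described above, with both hardware interfaces stubbed out as hypothetical stand-ins:

import random
import time

def thermostat_calls_for_heat() -> bool:
    # Stand-in for reading a real thermostat relay or GPIO pin;
    # randomized here only so the sketch is self-contained.
    return random.random() < 0.5

def run_work_unit() -> None:
    # Stand-in for one batch of distributed work (e.g. a WCG task).
    time.sleep(1)

while True:
    if thermostat_calls_for_heat():
        run_work_unit()   # full CPU load doubles as the heating element
    else:
        time.sleep(30)    # no call for heat: let the cores idle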
#7
Unregistered
Free heat? Controllable? Wireless charging? Other stuff as well?

What's the catch?

Also, does the wifi roaming mean free wifi? That would be pretty, uhum, RAD! XD
#8
dwade
Ryzen @ 4ghz is actually less efficient and uses more powa
#9
Cybrnook2002
What about liability and responsibility for the unit? Is the household supposed to put down a deposit for it?

What if the kids knock it over, or someone takes it apart (you know someone will)? What if your house has an electrical spike and the unit fries? Who's footing the bill?

Is my breaker going to keep popping when I turn on a lamp on the same circuit?

No thanks.
#10
Raevenlord
News Editor
RaevenlordNot to bash ETeknix, but Qarnot's page and the bottom picture in this article list a trio of i7 processors in the specifications, so, yeah, I'd say ours is pretty much "realer", unless Qarnot themselves messed up.
Apparently, both ETeknix and us were right. Qarnot just updated their specifications page to include Ryzen 7 processors, as is reflected in the body of the news piece.
#11
trog100
so what happens when the extra free heating isnt required (summer time) but the compute power is.. ??

i can help justify the leccy used by my mining rig in the winter depending where i put the bloody thing.. in the summer its a negative no matter how i look at it.. :)

trog
#12
Baffles
RaevenlordApparently, both ETeknix and us were right. Qarnot just updated their specifications page to include Ryzen 7 processors, as is reflected in the body of the news piece.
Cool, thanks for checking. Normally I wouldn't bring it up, but both articles explicitly excluded the other brand, which I felt was very odd.
#13
nuwb
trog100so what happens when the extra free heating isnt required (summer time) but the compute power is.. ??

i can help justify the leccy used by my mining rig in the winter depending where i put the bloody thing.. in the summer its a negative no matter how i look at it.. :)

trog
This. Unless they are gonna ship their server/heaters to the other hemisphere every 6 months, these units will be largely dormant. They might be able to get away with lowering the wattage through underclocking, but it will always be a negative in summer.
#14
FordGT90Concept
"I go fast!1!11!1!"
Which is why something like WCG is better. They'll take whatever they can get, whenever they can get it.

Likewise, think of compute servers like wind farms. The more the wind blows, the less the natural gas turbines have to burn fuel to compensate. Servers installed for heating purposes can do the same for servers installed at farms: during the winter, server farms could completely shut down.

I think there's something to the idea, but the problem is cost, as well as the fact that this hardware becomes obsolete from the compute perspective quickly. Your average furnace should last 20 years or more. Think of the computer hardware we were using 20 years ago and the problem should be obvious.

Perhaps we're looking at this the wrong way: instead of making computers that act as furnaces, we should have standard radiators that hook up to ridiculously high wattage processors (thousands of watts). That's something that could reasonably be changed every five years or so.


I think this is something the US Department of Energy and National Research Laboratories should seriously look at.
#15
yotano211
You want "free heating", just get a bunch of mining machines spread out through the house.
#16
R-T-B
FordGT90ConceptI'm disappointed. I was expecting a 4+ kW central heating core. You know, literally replacing your furnace. When the thermostat tells your "furnace" to turn on, it starts computing like mad and stops when the thermostat tells it to. The problem is the cost. Computer cores cost a crapload more than just metal filaments.

If such a thing existed, I'd be interested as long as the price was reasonable. Even if all it did was WCG, it'd be better than nothing.
Bitcoin miner furnace.
#18
FordGT90Concept
"I go fast!1!11!1!"
R-T-BBitcoin miner furnace.
Whatever is used has to be more versatile (mining is specific and a fad) and less expensive (GPUs cost a fortune because of the aforementioned fad).
#19
R-T-B
FordGT90Conceptmining is specific and a fad
I don't think it's a fad (I expect this to become the new norm), but regardless, the nature of GPUs in compute is far from specific.

Your expense argument makes more sense... depending on the market, anyways.
#20
Basard
Yeah... great, but when will we get house COOLING computing? Damn cheapskates...
#21
evernessince
dwadeRyzen @ 4ghz is actually less efficient and uses more powa
You won't see a Ryzen processor running at 4 GHz in a server environment; the performance-per-watt curve starts becoming unfavorable for Ryzen above 3.7 GHz. For the same reason, you won't see an Intel processor running above stock either. I don't really get the point of your comment, though: it's like comparing the worst Ryzen could possibly do efficiency-wise against the best Intel could do. Obviously not a fair comparison.
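A toy model of that falloff (the simplified P = k·V²·f scaling and every voltage below are illustrative assumptions, not measured Ryzen data):

def power_watts(freq_ghz: float, volts: float, k: float = 10.0) -> float:
    # Simplified dynamic-power model: P grows with V^2 * f.
    return k * volts**2 * freq_ghz

# Assumed voltage needed at each clock; voltage climbs near the limit.
for f, v in [(3.4, 1.05), (3.7, 1.15), (4.0, 1.40)]:
    p = power_watts(f, v)
    print(f"{f} GHz: {p:5.1f} W, {f / p:.4f} GHz per watt")
# Perf-per-watt drops sharply at the 4.0 GHz point:
# ~0.0907 -> ~0.0756 -> ~0.0510 GHz per watt.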
#22
eidairaman1
The Exiled Airman
evernessinceYou won't see a Ryzen processor running at 4 GHz in a server environment; the performance-per-watt curve starts becoming unfavorable for Ryzen above 3.7 GHz. For the same reason, you won't see an Intel processor running above stock either. I don't really get the point of your comment, though: it's like comparing the worst Ryzen could possibly do efficiency-wise against the best Intel could do. Obviously not a fair comparison.
Says the one without system specs, boo whoo...

It would be nice to take that thermal energy and recycle it back as electricity.
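For a sense of scale on that idea, a worked Carnot-limit estimate (both temperatures are assumed for illustration):

# Thermodynamic upper bound on turning waste heat back into electricity.
# Assumed temperatures: heatsink exhaust ~345 K, room air ~295 K.
t_hot, t_cold = 345.0, 295.0
carnot = 1 - t_cold / t_hot                   # ideal Carnot efficiency
print(f"Carnot limit: {carnot:.1%}")          # ~14.5%
print(f"Best case from 500 W of heat: {500 * carnot:.0f} W")
# Real thermoelectric generators recover far less, typically ~5%.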
#23
RejZoR
evernessinceYou won't see a Ryzen processor running at 4 GHz in a server environment; the performance-per-watt curve starts becoming unfavorable for Ryzen above 3.7 GHz. For the same reason, you won't see an Intel processor running above stock either. I don't really get the point of your comment, though: it's like comparing the worst Ryzen could possibly do efficiency-wise against the best Intel could do. Obviously not a fair comparison.
It's because with servers, they value stability, not raw performance. Overclocking is out of the question there. Workstations, maybe, and even then only in small companies where they can manually verify a few systems at higher clocks. Besides, servers also value threads over clocks. That's why EPYC runs at clocks around 3 GHz or less, but with ridiculous numbers of threads. It's just the raw number of cores that offsets the lower clocks.
#24
FordGT90Concept
"I go fast!1!11!1!"
R-T-BI don't think it's a fad (I expect this to become the new norm), but regardless, the nature of GPUs in compute is far from specific.

Your expense argument makes more sense... depending on the market, anyways.
If compute furnaces were to see widespread use, they would have to be designed in a way that can be cheaply mass-produced. I'm picturing PCBs with hundreds of ASIC chips sticking out of them (like a slotted processor of yore) so the chips can be cooled from the front, back, and edges. It would also be easy to replace a single unit that failed by plucking it out and sticking another one in. Because of the layout, it doesn't need fancy heat sinks either. There would be no system memory, just a controller that delegates tasks and gets the chips access to the network.


If the National Laboratories got behind the initiative, they could pick up the bill for the cost difference.
#25
R-T-B
FordGT90ConceptIf compute furnaces were to see widespread use, they would have to be designed in a way that can be cheaply mass-produced. I'm picturing PCBs with hundreds of ASIC chips sticking out of them (like a slotted processor of yore)
Litecoin (scrypt) mining ASIC I used to possess (photo omitted):

Not far off.