Thursday, September 14th 2017
Space Heater Concept Reinvented: Qarnot's House Warming Computing Ft. Intel, AMD
Update: Qarnot has updated their page with AMD Ryzen 7 support for its 3 computing units, so it's not limited to Intel offerings. You can see the before and after screenshots on the bottom of this article.
In a move that is sure to bring the cozy, homely warm feeling back to the space heater concept of yore - who doesn't remember AMD's mocking videos of NVIDIA's Fermi architecture? - French company Qarnot has announced the third-generation iteration of a product that is sure to shift a few degrees Celsius in the computing space. The company has decided not to let the (until now) wasted heat generated by computing hardware go unused, and instead capitalizes on those "wasted" byproduct degrees as a means of reducing both its own and its users' heating bills. The Q.rad concept takes what is usually seen as a drawback of hardware - the amount of waste heat it generates - and turns it into a net positive, by making sure that heat is put to good use in raising the otherwise chilly temperatures you might be facing.

The Q.rad makes use of three cloud-enabled Intel Core i7 processors running at 4 GHz - perhaps Ryzen didn't make the cut since it is comparatively more energy efficient per core - to build computing blocks that double as radiators. As Qarnot crunches data (typically 3D rendering and VFX for film studios), its Q.rad provides up to 500 W of "soft heating power" to your home. Reusing heat in this manner reduces Qarnot's carbon footprint, provides free heating for homes and offices, and also reduces Qarnot's server space overhead, so both parties benefit. This is a company taking an outside-the-box look at an old problem and finding an elegant, egg-of-Columbus solution: obvious in hindsight, but not thought of until now.

Essentially, the company is asking you to lend it the space you would be using for a heater anyway, by allowing it to set up a Q.rad in your home. The company thereby needs less space for its server infrastructure, cuts the cooling costs of its cloud computing hardware, and offers you 500 W of heating power for free.
The electricity costs of the Q.rad's operation are tracked by an integrated meter, which calculates the energy expense and refunds it to the host. This is the kind of out-of-the-box thinking that lets companies grow into previously unheard-of spaces and opens new doors for distributed computing - or does it?
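The refund mechanic described above is simple in principle: meter the unit's energy draw and pay the host back at their electricity tariff. A minimal sketch of that calculation - purely illustrative, with hypothetical function and parameter names, not Qarnot's actual metering firmware - might look like this:

```python
# Illustrative sketch of a heater-host refund calculation.
# All names and the tariff figure are assumptions for the example.

def compute_refund(avg_watts: float, hours: float, tariff_per_kwh: float) -> float:
    """Return the electricity refund owed to the host.

    avg_watts      -- average power draw of the unit in watts
    hours          -- hours of operation in the billing period
    tariff_per_kwh -- host's electricity price per kilowatt-hour
    """
    kwh = avg_watts * hours / 1000.0  # watt-hours -> kilowatt-hours
    return round(kwh * tariff_per_kwh, 2)

# Example: a 500 W unit running 10 hours a day for 30 days at 0.15/kWh
print(compute_refund(500, 10 * 30, 0.15))  # 22.5
```

At full tilt, a 500 W unit consumed for 300 hours draws 150 kWh, so the host's bill rises by roughly 150 kWh times their tariff - exactly the amount the integrated meter would credit back.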
Source: Qarnot's Q.rad
34 Comments on Space Heater Concept Reinvented: Qarnot's House Warming Computing Ft. Intel, AMD
www.eteknix.com/french-company-using-ryzen-pro-to-heat-homes/
TPU on team blue and Eteknix on team red?
If such a thing existed, I'd be interested as long as the price was reasonable. Even if all it did was WCG, it'd be better than nothing.
What's the catch?
Also, does the wifi roaming mean free wifi? That would be pretty, uhum, RAD! XD
What if the kids knock it over, or someone takes it apart (you know someone will). What if your house has an electrical spike and the unit fries, who's footing the bill?
Is my breaker going to keep popping when I turn on a lamp on the same circuit?
No thanks.
it can help justify the leccy used by my mining rig in the winter depending on where i put the bloody thing.. in the summer it's a negative no matter how i look at it.. :)
trog
Likewise, think of compute servers like wind farms. The more the wind blows, the less fuel natural gas turbines have to burn to compensate. Servers installed for heating purposes can do the same relative to servers installed at dedicated farms. During the winter, server farms could shut down completely.
I think there's something to the idea, but the problem is cost, as well as the fact that this hardware becomes obsolete from the compute perspective quickly. Your average furnace should last 20 years or more. Think of the computer hardware we were using 20 years ago and the problem should be obvious.
Perhaps we're looking at this the wrong way: instead of making computers that act as furnaces, we should have standard radiators that hook up to ridiculously high wattage processors (thousands of watts). That's something that could reasonably be changed every five years or so.
I think this is something the US Department of Energy and National Research Laboratories should seriously look at.
Your expense argument makes more sense... depending on the market, anyways.
It would be nice to take that thermal energy and recycle it back as electricity
If the National Laboratories got behind the initiative, they could pick up the bill for the cost difference.
Not far off.