# Your next water cooler may be much smaller.



## lilhasselhoffer (Oct 6, 2015)

So, I found the following bit of experimental information:
http://www.news.gatech.edu/2015/10/05/liquid-cooling-moves-chip-denser-electronics

The goal was to improve water cooling technology and make it more efficient.  They took a silicon die, cut microchannels into the back of it, and pumped deionized water through them.

The result was a 60% performance improvement over a standard air cooler (though standard water cooling wasn't explored).


Caveat emptor: the technology basically requires that the microchannels never build up sediment (the "large" silicon cylinders inside the channels, designed to increase surface area, were only 100 microns in diameter).
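Those 100-micron cylinders are there to multiply the wetted surface area inside the channels. A rough back-of-the-envelope sketch; only the 100-micron diameter is from the article, while the pillar height, pillar count, and die size are my assumptions for illustration:

```python
import math

# Back-of-the-envelope sketch of why the micropillars matter. Only the
# 100-micron pillar diameter comes from the article; the pillar height,
# pillar count, and die size below are assumed for illustration.
pillar_d = 100e-6        # m, pillar diameter (from the article)
pillar_h = 200e-6        # m, assumed pillar height
n_pillars = 10_000       # assumed pillar count across the die
die_area = (10e-3) ** 2  # m^2, assumed 10 mm x 10 mm die

# Each pillar adds its lateral (side-wall) area to the wetted surface.
added_area = n_pillars * math.pi * pillar_d * pillar_h
gain = added_area / die_area
print(f"Added wetted area: {added_area * 1e6:.0f} mm^2 "
      f"(~{gain:.1f}x the flat die area)")
```

On those assumed numbers the pillars multiply the wetted area roughly sixfold, which is why clogging even a fraction of them with sediment hurts so much.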




So, anyone else want to start delidding processors, etching channels, and getting that extra performance?


----------



## EarthDog (Oct 6, 2015)

So dyes are really out of the question here? LOL!

Delidding is ridiculous for 90% of enthusiasts; this just takes it to another level.


----------



## P4-630 (Oct 6, 2015)

I would never delid a CPU, just lap it.


----------



## buildzoid (Oct 6, 2015)

I might try it for the hell of it with a cheap AMD APU. However, I have serious doubts that we'll see this in use any time soon, since ATM even regular water cooling isn't necessary. Also, what would you use to cut microchannels into silicon? A small diamond saw?


----------



## cdawall (Oct 6, 2015)

60% over a standard air cooler?

That's a standard air cooler, so sweet, maybe one day it will perform as well as any good air cooler.


----------



## lilhasselhoffer (Oct 6, 2015)

buildzoid said:


> I might try it for the hell of it with a cheap AMD APU. However, I have serious doubts that we'll see this in use any time soon, since ATM even regular water cooling isn't necessary. Also, what would you use to cut microchannels into silicon? A small diamond saw?



The article is less than specific.  Offhand, I'd conjecture either a laser or acid etching; acid being much cheaper, but a laser being more precise.



cdawall said:


> 60% over a standard air cooler?
> 
> That's a standard air cooler, so sweet, maybe one day it will perform as well as any good air cooler.



I don't think you're looking at this apples to apples.  

By a standard air cooler they mean a chunk of metal (likely aluminum, to inflate their results) with air flowing over the fins.  They didn't cite a beastly air cooler, or what I think you'd call a "good" air cooler.  Realistically, they worked with an Altera FPGA, not an Intel-designed CPU.  In short, you're right, but not quite comparing like with like.


----------



## FireFox (Oct 6, 2015)

P4-630 said:


> I would never delid a CPU


Maybe you should give it a try


----------



## cdawall (Oct 6, 2015)

I love the thought of improvement, but how about, instead of cooling better, we lower wattage and increase performance? Seems like a win-win situation.


----------



## thesmokingman (Oct 6, 2015)

lilhasselhoffer said:


> So, I found the following bit of experimental information:
> http://www.news.gatech.edu/2015/10/05/liquid-cooling-moves-chip-denser-electronics
> 
> The goal was to improve water cooling technology, and make it more efficient.  They took a silicon die, cut micro channels into the back of it, and pumped deionized water through the micro channels.
> ...




IBM has been doing this for *years*, albeit in a much more refined form. The article is dated 2008. 

http://www.cnet.com/news/ibm-to-cool-layered-chips-with-water/


"To address that problem, the team has developed a cooling system consisting of micropipes of water as thin as a human hair (50 microns) that are interspersed between each chip layer.

To prevent an electrical short, the hairlike water pipes are hermetically sealed from the chip's other components first with a silicon wall and then with a layer of silicon oxide, according to Brunschwiler."


Also, they recently showed off their brain-inspired cooling. This stuff is crazzzzy.

http://www.research.ibm.com/cognitive-computing/brainpower/

http://www.dailymail.co.uk/sciencet...in-inspired-runs-electrolyte-rich-liquid.html


----------



## lilhasselhoffer (Oct 6, 2015)

thesmokingman said:


> IBM has been doing this for *years*, albeit much more refined. Article is dated 2008.
> 
> http://www.cnet.com/news/ibm-to-cool-layered-chips-with-water/
> 
> ...



Only a few problems.
1) The cited article is about chips designed with water pipes flowing through them.  It requires expensive design changes, special solder, and is predicated upon a radical shift in CPU design.
2) The article was from 2008, and cites market penetration in 5-10 years.  It has been 7, but nothing indicates we'll be seeing this any time soon.  Perhaps a dead technology?  Perhaps too expensive?  No answers are available, but it definitely isn't hitting shelves any time soon.
3) Neural chip design doesn't exactly mesh with this at all.  While interesting, it's off topic.



We're looking at tech that we could see tomorrow, and you're citing tech that would need years to come out.  If I'd titled the thread "the future of cooling" you'd have beaten me to the punch.  As it stands, this is interesting but not yet applicable.


----------



## thesmokingman (Oct 6, 2015)

lilhasselhoffer said:


> Only a few problems.
> 1) The cited article is about chips designed with water pipes flowing through them.  It requires expensive design changes, special solder, and is predicated upon a radical shift in CPU design.
> 2) The article was from 2008, and cites penetration in 5-10 years.  It has been 7, but nothing indicates we'll be seeing this any time soon.  Perhaps a dead technology?  Perhaps too expensive?  No answers are available, but it definitely isn't hitting any time soon.
> 3) Neural chip design doesn't exactly mesh with this at all.  While interesting, it's off topic.
> ...




They already have in-chip watercooling and, like I wrote, much more refined. It's somewhat old news. Why you no google? Btw, Zurich is for the Zurich supercomputer.

http://www.wired.com/2013/01/ibm-waterworld/

"A model 3-D liquid cooled chip built by IBM researchers in Zurich. The 3-D chips use miniscule cooling channels to dissipate the intense amount of heat generated by these chips, which stack lots of processor cores in a very small amount of space."


----------



## lilhasselhoffer (Oct 6, 2015)

thesmokingman said:


> They already have in chip watercooling and like I wrote much more refined. It is somewhat old news. Why you no google? Btw Zurich is for the Zurich supercomputer.
> 
> http://www.wired.com/2013/01/ibm-waterworld/
> 
> "A model 3-D liquid cooled chip built by IBM researchers in Zurich. The 3-D chips use miniscule cooling channels to dissipate the intense amount of heat generated by these chips, which stack lots of processor cores in a very small amount of space."



You seem to be conflating minuscule and microscopic.


The articles you've linked demonstrate a variety of fun theories.  If you could solder a system of pipes into a die, you'd have an extremely effective cooling device, because there's virtually no distance between heat source and heat sink.  But that would require a substantial redesign of chips and much more expensive construction, and despite being introduced 7 years ago it has made absolutely no headway.

The next is a completely unrelated article, about neural chips.  Interesting, but it doesn't relate to cooling.

Moving on, you've got macro-scale cooler design.  IBM building the cooling structure of servers is interesting, but that doesn't really concern most people.  Running the water temperature higher is interesting, but again it doesn't influence anything consumer related.  Additionally, designing smaller pipes for conventional liquid coolers (what you've linked to most recently) is cool, but again isn't what the article I cited was about.



Perhaps a review is in order.  A standard FPGA was stripped of its IHS, microchannels were cut into the silicon (something that can be done with any chip, not just something IBM designed for it), the channels were sealed, and water was pumped through them.  They demonstrated that such a cooling mechanism could cool substantially better than air coolers, and was adaptable to a chip that didn't require a substantial redesign.

Everything you've cited requires a design from the ground up to be effective.  You have to design pipes inside the CPU, or design a complete server cooling system, or worse yet design the whole processor and mounting to both allow fluid and be completely watertight.  All of these things are interesting, but have absolutely no use unless you spend a boatload of money and redesign chips from the ground up. 


To be sure, microfluidics isn't a new science.  We've had breakthroughs that use microchannels to pump fluids without a pump, make surfaces either hydrophilic or hydrophobic, induce boiling well below the normal boiling point, and a bunch of other things.  I cannot find any credible example where someone went to the trouble of using an existing CPU and testing whether the results could be practically duplicated.  Based upon your research, you haven't either.  I didn't post this because microfluidics is something new, but because it's demonstrable proof that theory has practical applications that can be realized today.


----------



## cdawall (Oct 6, 2015)

The server industry isn't microscopic...


----------



## thesmokingman (Oct 6, 2015)

He seems to have a problem with the concept that it's been done already? From the OP:

"To make their liquid cooling system, Bakir and graduate student Thomas Sarvey removed the heat sink and heat-spreading materials from the backs of stock Altera FPGA chips. They then etched cooling passages into the silicon, incorporating silicon cylinders approximately 100 microns in diameter to improve heat transmission into the liquid. A silicon layer was then placed over the flow passages, and ports were attached for the connection of water tubes."

The above is like totally different from the below.

"To address that problem, the team has developed a cooling system consisting of micropipes of water as thin as a human hair (50 microns) that are interspersed between each chip layer."

Right, like I wrote: IBM had already broken this ground years before. The OP seems to think this is something that will help consumers like tomorrow? Who's gonna etch 100-micron silicon-coated passages into their i7s anytime soon?


----------



## cdawall (Oct 6, 2015)

No one. If they want extreme cooling they will simply run LN2.


----------



## thesmokingman (Oct 6, 2015)

cdawall said:


> No one if they want extreme cooling they will simply run LN2.




Concur. Though even that is a gigantor hassle.


----------



## cdawall (Oct 6, 2015)

thesmokingman said:


> Concur. Though even that is a gigantor hassle.



Peltier cooling is another option that is guaranteed to get lower temps than any above ambient water setup.


----------



## thesmokingman (Oct 6, 2015)

cdawall said:


> Peltier cooling is another option that is guaranteed to get lower temps than any above ambient water setup.




Oh yea, especially that setup with the ambient control. I can't remember the name of the guy that made that block atm though.


----------



## silentbogo (Oct 6, 2015)

cdawall said:


> Peltier cooling is another option that is guaranteed to get lower temps than any above ambient water setup.


Not really. Back in the day, when TEC coolers on the retail market were a new thing, I looked at some reviews, and almost all of these devices fell a little short of high-performance air cooling, while pricewise they were in the ballpark of a mediocre H2O rig.
Another big problem with Peltier elements is efficiency. If, for example, you have an element that can pump 60W of heat off the cold side, it will dump more than 60W of heat on the hot side (the pumped heat plus its own electrical draw). In order to prevent burning it up you either need a gigantic HSF assembly or a watercooling rig, which pretty much kills the whole purpose of using a Peltier element.
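To put the 60W example into numbers, here's a minimal sketch of the hot-side heat balance; the operating voltage and current are assumed values for illustration, not a real module's spec:

```python
# Heat balance for a Peltier (TEC) module: everything pumped off the
# cold side PLUS the module's own electrical draw exits the hot side.
# The 60 W figure matches the example above; the voltage and current
# are assumed for illustration.
q_cold = 60.0          # W, heat pumped from the cold side
v, i = 12.0, 10.0      # assumed operating voltage and current
p_in = v * i           # 120 W of electrical input, all of it ends up as heat
q_hot = q_cold + p_in  # heat the hot-side heatsink or loop must shed

cop = q_cold / p_in    # coefficient of performance
print(f"Hot side must shed {q_hot:.0f} W for 60 W of cooling (COP = {cop:.1f})")
```

So on these assumed numbers the hot side has to shed 180W to deliver 60W of cooling, which is why the hot-side heatsink ends up so oversized.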


----------



## buildzoid (Oct 6, 2015)

cdawall said:


> Peltier cooling is another option that is guaranteed to get lower temps than any above ambient water setup.



Peltiers need low TDPs to work well. I tried building a 400W TEC chiller, and the amount of heat the TECs generate is insane even when I run the loop with no load. With a heatsink upgrade I might be able to go sub-zero on water temp, but it's expensive. If you want to put a TEC directly on the CPU, there are some 437W Qmax TECs that pull 550W and are 62mm square. They pack enough punch to take a 4-core i7 subzero, but you need custom water blocks to cool them. I'm thinking of trying a TEC on a 2-core AMD APU with just an air cooler, but I doubt it will lead to anything. ATM the best use for TEC cooling is if you want to cram sub-zero into a small case and don't care about the costs. Then you just build 2 water loops: 1 to cool the TECs and 1 to cool the CPU. And even then the rads you need are a problem.
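For scale, the hot-side load on one of those 437W Qmax / 550W modules can be sketched the same way; the per-radiator capacity below is an assumed rule of thumb, not a measured spec:

```python
import math

# Worst-case heat the hot-side water loop must shed for a 437 W Qmax
# TEC that pulls 550 W electrically (numbers from the post above).
q_max = 437.0            # W, heat pumped at full load
p_elec = 550.0           # W, electrical draw, all converted to heat
q_hot = q_max + p_elec   # total heat into the hot-side loop

# ~150 W per 120 mm radiator section is an assumed rule of thumb.
rad_per_120mm = 150.0
sections = math.ceil(q_hot / rad_per_120mm)
print(f"Hot-side loop sheds ~{q_hot:.0f} W, "
      f"roughly {sections} x 120 mm of radiator")
```

Call it close to a kilowatt into the loop, which is why the rads are the problem.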



silentbogo said:


> Not really. Back in the day, when TEC coolers on the retail market were a new thing, I looked at some reviews, and almost all of these devices fell a little short of high-performance air cooling, while pricewise they were in the ballpark of a mediocre H2O rig.
> Another big problem with Peltier elements is efficiency. If, for example, you have an element that can pump 60W of heat off the cold side, it will dump more than 60W of heat on the hot side (the pumped heat plus its own electrical draw). In order to prevent burning it up you either need a gigantic HSF assembly or a watercooling rig, which pretty much kills the whole purpose of using a Peltier element.



If your goal is a compact subzero cooler peltiers are still your best bet if you don't mind the stupid costs of building 2 water loops just to cool your CPU.


----------



## silentbogo (Oct 6, 2015)

buildzoid said:


> If your goal is a compact subzero cooler peltiers are still your best bet if you don't mind the stupid costs of building 2 water loops just to cool your CPU.


When I was still curious about overclocking, that was my first idea: make 2 isolated water loops. But the cost and size were so ridiculous, the project died on the drafting board. 

I still have a couple of small 40W elements laying around (bought for other purposes), which might be good enough for cooling something like my new Pentium G3250, but I doubt I will go sub-zero on that one. Too afraid of condensation.


----------



## lilhasselhoffer (Oct 6, 2015)

cdawall said:


> The server industry isn't microscopic...



Allow me to answer, because you've obviously not read the linked articles.

The articles cite minuscule tubes of water.  These aren't micron-sized tubes, but 1/16" or smaller (really, take a look at the links, which so few people seem to ever actually read).  That isn't anywhere near the same scale as my original citation.  If 100-micron tubes are inside the channel, then it isn't comparing the same standards of small.  This is why I said there was a discrepancy.  Heck, 100-micron tubes inside the IBM piping (the final article, not the human-hair one) would likely be considered acceptable debris.

You somehow taking this to mean the server industry is microscopic is an unfathomable leap of logic.  I didn't say anything like that, only that the server industry's application of cooling design doesn't in any way relate to my linked article.




thesmokingman said:


> He seems to have a problem with the concept of it's been done already? From the OP:
> 
> "To make their liquid cooling system, Bakir and graduate student Thomas Sarvey removed the heat sink and heat-spreading materials from the backs of stock Altera FPGA chips. They then etched cooling passages into the silicon, incorporating silicon cylinders approximately 100 microns in diameter to improve heat transmission into the liquid. A silicon layer was then placed over the flow passages, and ports were attached for the connection of water tubes."
> 
> ...



That's not the point, and you know it.

I could etch those channels today.  These people took an off-the-shelf product and demonstrated what they could do.  How long before you could demonstrate the water pipes on an i7, if you could ever do it?  

IBM spends millions, if not billions, of dollars on prototypes that will never see any practical use.  Do you not understand the vast difference there?  I'm not saying the same thing as any of the articles you linked, which suggest that in a decade some of this could be useful; I'm pointing to someone who demonstrated useful techniques that could be done tomorrow, without millions of dollars of investment.

By your logic, if Intel demonstrated a one-part production run of 4 nm lithography tomorrow, the whole 14 nm and 10 nm lithography process they're about to break ground on would be completely out of date.  Yeah, there's no practical use for 4 nm yet, but because it exists in a think tank every consumer should demand its adoption tomorrow.  Seems silly, as we need both things.  We need research to push boundaries, but research without practical application is a useless waste of money.


----------



## thesmokingman (Oct 6, 2015)

lilhasselhoffer said:


> Allow me to answer, because you've not read the articles linked to obviously.
> 
> The articles cite miniscule tubes of water.  These aren't micron sized tubes, but 1/16" or less sizes (really, take a look at the links which so few people seem to ever actually read).  That isn't anywhere near the same scale as my original citation.  If 100 micron tubes are inside the channel, then it isn't comparing the same standards of small.  This is why I said there was a discrepancy.  Heck, 100 micron tubes inside the IBM piping (the final article, not the human hair one) would likely be considered acceptable debris.
> 
> ...




What is your point? From where I'm standing you seem to only want to argue? You must be right about something because you won't stop going on about whatever it is?

This is rather comical though. IBM did this a decade before...


----------



## lilhasselhoffer (Oct 6, 2015)

thesmokingman said:


> What is your point? From where I'm standing you seem to only want to argue? You must be right about something because you won't stop going on about whatever it is?
> 
> This is rather comical though. IBM does this a decade before...


 
My point is practical application.

If you want to be pedantic, you're wrong.  Microfluidics has been a topic for much longer than IBM has been dipping their toes in.  Less than a decade of research by IBM means nothing.  Want to quote Google?  Better prepare for the academic slap you've been trying to give me: Tabeling, P (2006). _Introduction to Microfluidics_. Oxford University Press. ISBN 0-19-856864-9.


I'm arguing because you fail to recognize the difference between practical application and theory.  I could do what this paper cited tomorrow, with any chip on the market.  Give me a year and I couldn't do what IBM did.  That's the difference.


----------



## thesmokingman (Oct 6, 2015)




----------



## lilhasselhoffer (Oct 6, 2015)

thesmokingman said:


>



I'm going to just assume you're trolling and stop responding to you then.


Edit:
Kudos, I took the bait.


----------



## cdawall (Oct 7, 2015)

silentbogo said:


> Not really. Back in the day, when TEC coolers on the retail market were a new thing, I looked at some reviews, and almost all of these devices fell a little short of high-performance air cooling, while pricewise they were in the ballpark of a mediocre H2O rig.
> Another big problem with Peltier elements is efficiency. If, for example, you have an element that can pump 60W of heat off the cold side, it will dump more than 60W of heat on the hot side (the pumped heat plus its own electrical draw). In order to prevent burning it up you either need a gigantic HSF assembly or a watercooling rig, which pretty much kills the whole purpose of using a Peltier element.



Even the small 92W unit I modified my V10 to use offered a good performance upgrade. It's a secondary cooler on the main heatsink, but works the same way. If you are against pelts, run a phase change.


----------



## thesmokingman (Oct 7, 2015)

^^ Would dig a quad-TEC setup like Mindchill's, it's so cool.


----------

