Wednesday, November 8th 2023

Intel Shuts Down its Cryo Cooling Technology Development

According to @momomo_us, Intel has discontinued its Cryo Cooling Technology as of July 1, 2023, marking the end of one of the tech industry's few sub-ambient cooling options. The technology, which could chill CPUs to 0 degrees Celsius to enhance performance, accompanied Intel's processors from the 10th-generation Comet Lake to the 13th-generation Raptor Lake. Despite its innovative approach to boosting CPU performance, the cooling solution was not widely embraced. The discontinuation comes just before the arrival of the 14th Generation Raptor Lake Refresh, which will not support the Cryo Cooling tech. Intel plans to maintain updates for the existing Cryo Cooling hardware until December 31, 2023.

This specialized cooling method did see some use in products like the Cooler Master MasterLiquid ML360 Sub-Zero and the EKWB EK-QuantumX Delta TEC waterblocks. Interestingly, the technology also worked with non-Intel CPUs: famed overclocker der8auer managed to get it running on AMD's Ryzen 9 5950X, albeit with some modifications. The likely reason for shutting down the cryo cooling project is that it no longer made financial sense: with such limited adoption, winding it down keeps R&D costs in check and frees up funds for other projects at Intel's laboratories.
Sources: @momomo_us (X/Twitter), via Tom's Hardware, der8auer (Image)

34 Comments on Intel Shuts Down its Cryo Cooling Technology Development

#26
Waldorf
@trsttte
Until the TEC part stops working (for whatever reason), then you will see how it's not a good idea to have it there.
#27
Chrispy_
sethmatrix7: Their net income is down year over year over year over year. www.investing.com/equities/intel-corp-financial-summary

Losing Apple wasn't good, and responding by making even more inefficient processors relative to their competition couldn't have helped.
Intel is still suffering from their failed 10nm node.

They didn't outsource to TSMC soon enough, so we had generations of 14nm overstaying their welcome, and now that 10nm is on its third revision (10 > 10ESF > "7") it's passable, but still only competitive with TSMC's 7nm from six years ago. Intel "4" will be Intel's first true 7nm process node, and we get to see how well that works when Meteor Lake laptop reviews arrive around Christmas/NY.

As much as I dislike Intel for anticompetitive practices, price-gouging, and generally holding back the industry, I do want them to succeed for multiple reasons: having a leading-edge node other than TSMC's available to the market is good for competition and progress, especially if Intel are willing to sell wafer capacity to others, and of course faster, more efficient desktops and laptops are very desirable. I will not shed a tear if 300W+ i9s become a thing of the past.
#28
trsttte
phanbuey: So the reason they use a huge amount of power is if they're trying to cool something the pelt isn't really equipped to handle (like a 300W processor). If you use it on a larger area instead, say on the side of the rad, or somewhere else in the loop, like a separate area/block hooked up to a heatpipe cooler, etc., it essentially gives you another 100W or so of chiller effect when needed, more surface area for cooling, and the ability to chill water on demand to keep it just above the dew point.

The problem with trying to cool a 300W CPU with a pelt is that it will use a monstrous amount of power, and it will still suffer from the "shim" effect (where the extra layer of double thermal paste makes it even less effective). As it gets saturated, it does the pelt thing where it becomes basically useless.

Actually, the CPU that would benefit the most from a 125W pelt on the chip is the 7800X3D: something that doesn't generate more than 90W of heat load but LOVES getting a colder contact plate.
The efficiency calculus is the same: if you add it to the radiator, it will still consume a stupid amount of power for the cooling it provides, even if you change the scale from 300W to 100W or 10W. The thing is, on the CPU it can work as a negative thermal resistance that facilitates heat transfer; on the rad, why not just use a bigger rad?

And if your answer is going sub-ambient, just use a chiller, which uses far less power.
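
To put very rough numbers on that, here's a minimal sketch, assuming a flat COP of 0.5 (an optimistic ballpark for a TEC held at a modest temperature delta; real COP depends on the module and on how hot the hot side runs):

```python
# Rough TEC energy arithmetic. The COP of 0.5 is an assumed ballpark,
# not a measured figure for any specific module.
def tec_budget(heat_load_w: float, cop: float = 0.5) -> tuple[float, float]:
    """Return (electrical input power, heat the hot side must reject)."""
    p_in = heat_load_w / cop    # power the TEC draws to pump the load
    q_hot = heat_load_w + p_in  # the hot side rejects load + input power
    return p_in, q_hot

for load in (90, 300):  # e.g. a 7800X3D-class load vs. a 300W i9
    p_in, q_hot = tec_budget(load)
    print(f"{load:>3} W load -> {p_in:.0f} W TEC input, {q_hot:.0f} W dumped at the rad")
```

At a 300W load the loop ends up rejecting roughly 900W, three times the CPU's own heat, which is the "stupid amount of power" in a nutshell.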
#29
phanbuey
trsttte: The efficiency calculus is the same: if you add it to the radiator, it will still consume a stupid amount of power for the cooling it provides, even if you change the scale from 300W to 100W or 10W. The thing is, on the CPU it can work as a negative thermal resistance that facilitates heat transfer; on the rad, why not just use a bigger rad?

And if your answer is going sub-ambient, just use a chiller, which uses far less power.
Yeah, at the end of the day it seems there's a real reason these have been around for as long as I can remember without a successful product ever coming out of them (including now).
#30
unwind-protect
In all seriousness: to get a clocking advantage you have to go way below zero (either F or C, ideally to -100). But once you go below the dew point you have to deal with condensation, and that kills the fun for most people and most applications.
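
For context, the threshold that matters for condensation is the dew point, not 0 C; here's a minimal sketch estimating it with the Magnus approximation (the coefficients are one common parameterization, valid roughly from -45 C to 60 C):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Estimate the dew point via the Magnus approximation."""
    a, b = 17.62, 243.12  # common Magnus coefficients for water vapor
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# In a 24 C room at 50% RH, any surface below ~13 C starts to sweat:
print(f"{dew_point_c(24, 50):.1f} C")  # ~12.9 C
```

So even a cold plate held a few degrees above freezing, never mind -100, needs insulation or active dew-point management in an ordinary room.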
#31
Chrispy_
unwind-protect: In all seriousness: to get a clocking advantage you have to go way below zero (either F or C, ideally to -100). But once you go below the dew point you have to deal with condensation, and that kills the fun for most people and most applications.
Going below ambient temperatures turns CPU cooling from a 'set-and-forget' affair once every year or two into something that requires regular maintenance and checks almost every time you turn on the PC. If you live to tinker, then I guess it's fun, but it's not actually useful, and all the speed benefits your 3% overclock advantage nets you are wasted ten times over in downtime and maintenance.
#32
ThrashZone
Hi,
With the price tags on these devices, it's not surprising they never went mainstream.
#33
DavidC1
Assimilator: There's nothing wrong with cutting divisions that have zero chance of ever being profitable or useful, or are only tangentially related to your primary product focus. Far too many megacorporations have suffered the death of a thousand cuts from the impact of having far too many R&D departments that are mostly each someone's pet project. TEC's usefulness is inherently limited by the laws of physics and there's not much Intel can do to change those, so really the question that should be asked is why the company ever went down this path in the first place - and apparently someone in Intel management finally did.
And it goes all the way to the top: Patrick Gelsinger.

These are decisions that should have been made years ago, but previous CEOs were all about chasing fads. They still are, AMD included, with the AI thing, which I believe will see some sort of crash; but drones and Peltier CPU coolers are complete nonsense, unlike the AI push.

Peltier coolers not only use a lot of power on their own, it's also ridiculously hard to cool the module itself, which is key to getting a low temperature on the cold side. So the hot-side cooler has to be absolutely massive. We may get better efficiency one day, but it's not today; the quick COP sketch below shows why.

Hopefully his cleanup will do something significant for Intel, like he did at VMware. No more spurious purchases and divisions.
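
A rough illustration, using the textbook maximum-COP formula for a thermoelectric cooler and assuming a dimensionless figure of merit ZT of about 0.9 (a ballpark for commodity bismuth-telluride modules, treated as constant here for simplicity):

```python
import math

def tec_cop_max(t_cold_k: float, t_hot_k: float, zt: float = 0.9) -> float:
    """Best-case COP of an ideal thermoelectric cooler between two temperatures."""
    m = math.sqrt(1.0 + zt)                   # sqrt(1 + Z*T_mean), ZT assumed flat
    carnot = t_cold_k / (t_hot_k - t_cold_k)  # Carnot limit for the same temperatures
    return carnot * (m - t_hot_k / t_cold_k) / (m + 1.0)

# Hold the cold plate at 280 K (7 C) and watch COP collapse as the hot side warms:
for t_hot in (300, 320, 340):
    print(f"dT = {t_hot - 280:>2} K -> best-case COP = {tec_cop_max(280, t_hot):.2f}")
```

Going from a 20 K to a 60 K delta cuts the best-case COP from about 1.8 to about 0.3, which is exactly why the hot-side heatsink has to be so oversized.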
#34
57er6uigyuholn
Six_Times"was not widely embraced"

yet another great engineering technology ignored. sad
I'd have bought it for my AMD rig :confused:
