
Asetek to Offer Liquid Cooling Solutions for Data Centers

Joined
Dec 6, 2011
Messages
4,784 (1.00/day)
Location
Still on the East Side
Asetek Inc., the world's leading supplier of liquid cooling solutions for computers, announced today that it will offer a range of liquid cooling solutions to address a diverse set of cooling challenges facing modern HPC clusters and data centers. These challenges range from reducing latency in high-frequency trading applications, to increasing performance and density in servers, HPC clusters and data centers, to decreasing the amount of energy used for cooling. All of these solutions utilize the company's proven CPU and GPU cooling technology, which is commercially deployed today in hundreds of thousands of computers around the world.

The benefit of Asetek liquid cooling technology is its efficiency in removing heat directly from processors and moving that heat to an optimal place for transferring it to the environment. The success of Asetek's technology results from its reliability, affordability and suitability for installation in high volume computer manufacturing. Asetek data center liquid cooling solutions utilize this technology to provide three general levels of server cooling:

- Internal Loop Liquid Cooling enables the use of the fastest processors, including high wattage processors, in high density servers.

- Rack CDU Liquid Cooling removes processor and/or GPU heat from rack servers and blades out of the data center without the use of traditional computer room air conditioners or water chillers, enabling extreme densities at the server, rack and data center levels. The strongest value proposition, however, is that the solution uses free outside ambient air for cooling, allowing around 50% power savings on data center cooling costs.

- Sealed Server Liquid Cooling removes all server heat from the data center via liquid; literally no air from inside the data center is used for server cooling. This solution enables high density with high performance processors and ambient room temperature server cooling.
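The "around 50% power savings" claim above can be sanity-checked with a back-of-the-envelope sketch. The IT load and cooling-overhead fractions below are purely assumed for illustration (they are not Asetek figures): if a chiller plant draws 0.4 kW of cooling power per kW of IT load and a free-air liquid loop draws 0.2 kW, halving the overhead yields the quoted 50%:

```python
# Back-of-the-envelope check of a "~50% cooling power savings" claim.
# All numbers below are illustrative assumptions, not Asetek data.

def cooling_power_kw(it_load_kw, cooling_overhead):
    """Cooling power drawn for a given IT load (overhead = cooling kW per IT kW)."""
    return it_load_kw * cooling_overhead

it_load = 500.0            # assumed IT load of the room, in kW
chiller_overhead = 0.40    # assumed: traditional CRAC/chiller cooling
free_air_overhead = 0.20   # assumed: liquid loop rejecting heat to outside air

baseline = cooling_power_kw(it_load, chiller_overhead)  # 200.0 kW
liquid = cooling_power_kw(it_load, free_air_overhead)   # 100.0 kW
savings = 1.0 - liquid / baseline

print(f"cooling power: {baseline:.0f} kW -> {liquid:.0f} kW ({savings:.0%} saved)")
```

Real savings depend on climate, water temperatures and the chiller plant being displaced; the point is only that the claim is arithmetically plausible when free-air cooling halves the cooling overhead.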

"We have studied the server market and engaged with our customers. While much of what is written suggests that the problem of data center cooling is monolithic, we have discovered the need is for a diverse set of solutions to meet specific data center performance, density and efficiency objectives," said André Sloth Eriksen, Founder and CEO of Asetek. "Using proven Asetek technology to engineer a range of cooling solutions gives Asetek a unique ability to address the wide diversity of cooling challenges that exist in the HPC and data center market today."

Leveraging its existing CPU and GPU liquid cooling technology brings both proven reliability and high-volume cost advantages to Asetek's data center liquid cooling solutions. Low-pressure system design is a cornerstone of Asetek's proven reliability: low pressure puts less stress on joints and connections, substantially reducing failures. The company's integrated pump-cold plate units for CPU and GPU cooling are well suited to providing redundancy in servers with multiple processors. All liquid channels are helium integrity tested, and all systems are liquid filled and sealed for life at the factory, eliminating the need for any liquid handling by the server OEM or data center operator. Additional information on Asetek data center cooling solutions is available at http://www.asetek.com/markets/data-centers.
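The case for moving heat out of the room with liquid rather than air rests on a basic physical fact: per unit volume, water absorbs on the order of a few thousand times more heat than air for the same temperature rise. A quick calculation with approximate textbook property values at room temperature:

```python
# Why liquid heat transport is attractive: volumetric heat capacity comparison.
# Property values are approximate room-temperature textbook figures.

cp_water = 4186.0    # J/(kg*K), specific heat of water
rho_water = 998.0    # kg/m^3, density of water

cp_air = 1005.0      # J/(kg*K), specific heat of air
rho_air = 1.204      # kg/m^3, density of air at ~20 C

vol_heat_water = cp_water * rho_water   # J/(m^3*K)
vol_heat_air = cp_air * rho_air         # J/(m^3*K)

ratio = vol_heat_water / vol_heat_air
print(f"water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
```

This is why a pair of water pipes can replace large air-handling ducts for the same heat load, as the replies below discuss.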

 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.11/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
Those look like some mean rads/fans. I also spot an MCP355 in that first photo.
 
Joined
Aug 7, 2008
Messages
5,739 (0.96/day)
Location
Wakefield, UK
Duct tape looks interesting...

Do want though.
 

TiN

Joined
Aug 28, 2009
Messages
215 (0.04/day)
Location
USA
System Name selfmade caseless :D
Processor Intel ES
Motherboard EVGA X299 DARK ES :)
Cooling LN2
Memory G.SKILL DDR4 @ 3600MHz
Video Card(s) GTX Kingpin's
Storage Intel P4500 4TB
Display(s) Philips BDM4350
Case none
Audio Device(s) Creative X-Fi
Power Supply EVGA NEX 1500W's
Mouse None (trackball)
Keyboard Steelseries APEX RAW and Corsair K-something
Software W2k8 R2 64bit SP1, FreeBSD 12
Benchmark Scores http://www.hwbot.org/community/user/tin?oldstyle=true
Most watercooling builds from modders look much better/cleaner :D
I would not install this in my servers, no way :D

One of the main ideas of watercooling is to reduce noise, which doesn't matter at all in the server market :)
 
Joined
Nov 13, 2004
Messages
455 (0.06/day)
Location
Canada/quebec/Montreal
System Name Custom DIY
Processor Intel i7 2600K @ 4.8 Turbo 1.4v
Motherboard Asus P8Z68-V Pro 8801
Cooling XSPC RS240 + 120mm Rad/fan
Memory Corsair 1866 Vengeance 9-10-9-27-2T
Video Card(s) 2X EVGA GTX570 SLI
Storage OCZ Revodrive 110GB + 2x1TB seagate
Display(s) ASUS MT276HE
Case CoolerMaster Sniper
Audio Device(s) X-Fi Titanium Fatal1ty Professional Series
Power Supply ANTEC TP-750 Blue 750Watts
Software win7 64
Benchmark Scores http://3dmark.com/3dm11/1347866
One of the main ideas of watercooling is to reduce noise, which doesn't matter at all in the server market :)

That's very true lol. I have 3 servers at iWeb and 2 at my office, and they are all very noisy, but I don't care: iWeb is far from me, and in my office they're in an isolated, cooled room :p So I don't see the point of WC servers (I use WC for my gaming system).
 
Joined
May 4, 2011
Messages
24 (0.00/day)
Location
Korpi
System Name 15" rMBP, mid 2012 base model
Processor Intel Core i7-3615QM
Video Card(s) NVIDIA GeForce GT 650M
One of the main ideas of watercooling is to reduce noise, which doesn't matter at all in the server market :)

Well, as you say, with servers noise is usually not an issue, as big iron really requires dedicated server rooms anyway. But I don't think noise reduction is what this is for.

I think the benefit is that with water the temperature delta is much smaller than with air cooling, so you could get longer lifespans out of hot components. This could be a huge benefit with mission-critical hardware.
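One way to put a number on that lifespan argument: a common rule of thumb (a rough consequence of the Arrhenius equation, not a guarantee for any specific part) is that electronics life roughly doubles for every 10 °C drop in operating temperature. A minimal sketch under that assumption:

```python
# Rule-of-thumb lifespan model: life roughly doubles per 10 C of cooling.
# This is an illustrative heuristic, not a datasheet-grade reliability model.

def relative_life(delta_t_c, doubling_step_c=10.0):
    """Expected life multiplier for running delta_t_c degrees cooler."""
    return 2.0 ** (delta_t_c / doubling_step_c)

# e.g. a liquid loop holds a CPU 20 C cooler than the air-cooled baseline:
print(f"{relative_life(20.0):.1f}x expected life")  # 4.0x under this rule
```

Real failure rates depend heavily on the component, workload and duty cycle, but the direction of the effect is what makes smaller temperature deltas attractive for mission-critical gear.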

And you could conduct most of the heat out of the server room through a couple of water pipes, instead of huge air ducts, so that might help with server room placement issues. (edit: Take a look at the last picture, the blade chassis, and you get the idea.)
 
TiN

lauri_hoefs

Before thinking about watercooling in mission-critical servers, answer this question: what temperatures are we talking about when both the watercooling loop and the heatsinks are packed into the same chassis, while big bulky rads occupy valuable space in these Asetek designs? Also, for mission-critical tasks, service/maintenance time must be minimized. In case of a fan failure, it takes a few minutes to replace the fan, and most servers don't even need a reboot because fans are usually hot-swappable. Even if a CPU fan fails, MC servers may allow hot-swap replacement of the CPU/cooling. What about a watercooling loop? :)

Understand me right, I love Asetek WBs and phase-change units, but servers... different story :)
 

brandonwh64

Addicted to Bacon and StarCrunches!!!
Joined
Sep 6, 2009
Messages
19,542 (3.49/day)
Most servers are in air-conditioned rooms. This would not really be useful to major companies that own such rooms. Our server rooms are cooled to 65°F 24/7; a simple air cooler would suffice.
 
Joined
May 4, 2011
Messages
24 (0.00/day)
Location
Korpi
System Name 15" rMBP, mid 2012 base model
Processor Intel Core i7-3615QM
Video Card(s) NVIDIA GeForce GT 650M
lauri_hoefs
Understand me right, I love Asetek WBs and phase-change units, but servers... different story :)

I see what you mean, but I don't think this would make replacing fans or other parts any harder or slower, if designed well, of course. I don't think putting the rads inside the chassis is a good idea, nor would it be a good idea to retrofit the cooling systems of existing server rooms.

But the blades, they are a different story. In existing blades the fans are either hotswappable ones on the blade cards, or on the chassis. Why would it be different with water cooling? And if you look at the blade chassis, it looks like it's actually designed to be hotswappable and easy to maintain, but of course we can't be sure without any extra info.

Yes, I see several problems with the concept too, but I also see why this could be feasible.

Ideally, taking a little longer to replace a blade card could pay off if there was more uptime due to fewer component failures :)
 