Tuesday, July 30th 2024

NVIDIA Blackwell's High Power Consumption Drives Cooling Demands; Liquid Cooling Penetration Expected to Reach 10% by Late 2024

With the growing demand for high-speed computing, more effective cooling solutions for AI servers are gaining significant attention. TrendForce's latest report on AI servers reveals that NVIDIA is set to launch its next-generation Blackwell platform by the end of 2024. Major CSPs are expected to start building AI server data centers based on this new platform, potentially driving the penetration rate of liquid cooling solutions to 10%.

Air and liquid cooling systems to meet higher cooling demands
TrendForce reports that the NVIDIA Blackwell platform will officially launch in 2025, replacing the current Hopper platform and becoming the dominant solution for NVIDIA's high-end GPUs, accounting for nearly 83% of all high-end products. High-performance AI server models like the B200 and GB200 are designed for maximum compute performance, with individual GPUs consuming over 1,000 W. HGX models will house eight GPUs each, while NVL models will support 36 or 72 GPUs per rack, significantly boosting the growth of the liquid cooling supply chain for AI servers.
TrendForce highlights the increasing TDP of server chips, with the B200 chip's TDP reaching 1,000 W, making traditional air cooling solutions inadequate. The TDP of the GB200 NVL36 and NVL72 complete rack systems is projected to reach 70 kW and nearly 140 kW, respectively, necessitating advanced liquid cooling solutions for effective heat management.
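As a rough sanity check on those rack figures, the sketch below compares the cited rack TDPs against the CPU+GPU load alone, assuming roughly 2,700 W per GB200 superchip (one Grace CPU plus two Blackwell GPUs); the per-superchip figure and the overhead interpretation are assumptions, not numbers from the TrendForce report.

```python
# Back-of-envelope check on the cited rack-level TDPs. The ~2,700 W per
# GB200 superchip (1 Grace CPU + 2 Blackwell GPUs) is an assumed figure,
# consistent with the >1,000 W per-GPU number above but not taken from
# the report itself.

SUPERCHIP_W = 2_700                              # assumed CPU+GPU draw per superchip
CITED_RACK_KW = {"NVL36": 70, "NVL72": 140}      # rack TDPs cited above

for name, gpu_count in (("NVL36", 36), ("NVL72", 72)):
    superchips = gpu_count // 2                  # 2 GPUs per superchip
    compute_kw = superchips * SUPERCHIP_W / 1_000
    gap_kw = CITED_RACK_KW[name] - compute_kw
    print(f"{name}: {compute_kw:.1f} kW CPU+GPU load, "
          f"~{gap_kw:.0f} kW left for NVSwitch trays, NICs, fans/pumps, "
          f"and power-conversion losses")
```

Under those assumptions, the compute load alone already exceeds what air cooling can realistically remove from a single rack, which is why the NVL systems push toward liquid cooling.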

TrendForce observes that the GB200 NVL36 architecture will initially utilize a combination of air and liquid cooling solutions, while the NVL72, due to higher cooling demands, will primarily employ liquid cooling.

TrendForce identifies five major components in the current liquid cooling supply chain for GB200 rack systems: cold plates, coolant distribution units (CDUs), manifolds, quick disconnects (QDs), and rear door heat exchangers (RDHx).

The CDU is the critical component responsible for regulating coolant flow to keep the rack within its designated TDP, preventing component damage. Vertiv is currently the main CDU supplier for NVIDIA AI solutions, with Chicony, Auras, Delta, and CoolIT undergoing continuous testing.
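For a sense of the scale a rack-level CDU has to handle, here is a minimal sizing sketch using the basic heat-transfer relation Q = ṁ·c_p·ΔT; the water-like coolant properties and the 10 K loop temperature rise are illustrative assumptions, not figures from the report or any vendor.

```python
# Minimal CDU sizing sketch: how much coolant flow is needed to carry a
# 140 kW rack load. Coolant properties and the 10 K temperature rise are
# illustrative assumptions.

HEAT_LOAD_W = 140_000        # ~NVL72 rack TDP cited above
CP_J_PER_KG_K = 4_186        # specific heat of water
DENSITY_KG_PER_L = 1.0       # water-like coolant density
DELTA_T_K = 10.0             # assumed coolant temperature rise across the rack

mass_flow_kg_s = HEAT_LOAD_W / (CP_J_PER_KG_K * DELTA_T_K)   # Q = m_dot * c_p * dT
flow_l_per_min = mass_flow_kg_s / DENSITY_KG_PER_L * 60.0

print(f"{mass_flow_kg_s:.2f} kg/s (~{flow_l_per_min:.0f} L/min) "
      f"to hold a {DELTA_T_K:.0f} K rise at {HEAT_LOAD_W / 1000:.0f} kW")
```

At roughly 200 L/min for a 140 kW rack, the CDU, manifolds, and quick disconnects all have to be sized for continuous high-volume flow, which is why they form a distinct supply chain.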

GB200 shipments expected to reach 60,000 units in 2025, making Blackwell the mainstream platform and accounting for over 80% of NVIDIA's high-end GPUs
In 2025, NVIDIA will target CSPs and enterprise customers with diverse AI server configurations, including the HGX, GB200 Rack, and MGX, with expected shipment ratios of 5:4:1. The HGX platform will seamlessly transition from the existing Hopper platform, enabling CSPs and large enterprise customers to adopt it quickly. The GB200 rack AI server solution will primarily target the hyperscale CSP market. TrendForce predicts NVIDIA will introduce the NVL36 configuration at the end of 2024 to quickly enter the market, with the more complex NVL72 expected to launch in 2025.
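For reference, the 5:4:1 ratio works out to the following shares of the expected 2025 shipment mix; the ratio comes from the text above, while the percentages are simple arithmetic and absolute volumes are not given.

```python
# Convert the cited 5:4:1 HGX : GB200 Rack : MGX shipment ratio into shares.
ratio = {"HGX": 5, "GB200 Rack": 4, "MGX": 1}
total = sum(ratio.values())
for name, parts in ratio.items():
    print(f"{name}: {parts / total:.0%}")   # HGX 50%, GB200 Rack 40%, MGX 10%
```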

TrendForce forecasts that in 2025, GB200 NVL36 shipments will reach 60,000 racks, with Blackwell GPU usage between 2.1 and 2.2 million units.
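The two figures are consistent with each other, as a quick multiplication shows; only the rack count and GPUs-per-rack numbers above go into it.

```python
# Consistency check: 60,000 NVL36 racks at 36 Blackwell GPUs per rack.
racks, gpus_per_rack = 60_000, 36
print(f"{racks * gpus_per_rack:,} GPUs")   # 2,160,000 -> within the 2.1-2.2 million estimate
```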

However, several variables remain in end customers' adoption of the GB200 Rack. TrendForce points out that the NVL72's power consumption of around 140 kW per rack requires sophisticated liquid cooling solutions, which makes deployment challenging. Additionally, liquid-cooled rack designs are better suited to new CSP data centers but involve complex planning processes. CSPs might also avoid being tied to a single supplier's specifications and opt for HGX or MGX models with x86 CPU architectures, or expand their self-developed ASIC AI server infrastructure for lower costs or specific AI applications.
Source: TrendForce

41 Comments on NVIDIA Blackwell's High Power Consumption Drives Cooling Demands; Liquid Cooling Penetration Expected to Reach 10% by Late 2024

#1
Klemc
The global climate is getting hotter and hotter...

It's time for PCs to also participate in this phenomenon !
#2
Daven
It’s really going to be up to us as consumers to reject these high power, expensive, obscenely high margin parts.

Our world will literally die the moment humans turn on millions of 1000W GPUs in order just to see better virtual water and light reflections in games and CGI films.
#3
bonehead123
Klemc: It's time for PCs to also participate in this phenomenon !
According to Ngreediya, this is already happening, and will only serve to increase prices on cooling parts overall, which will make them happy as this will give them yet ANUTHA reason to jack up GPU prices :(
#4
R0H1T
Daven: It's really going to be up to us as consumers to reject these high power, expensive, obscenely high margin parts.
If you look at the top parts in that leaked(?) image, it shows a DC part. It's up to our corporate overlords to reject this "AI" BS now, but that's obviously not happening :shadedshu:
#5
Zazigalka
If there is one thing that I despise that company for, it's not the proprietary stuff that made its way into gaming, cause that's what usually drives the competition to develop the simpler but open version of it. It's the introduction of those high power GPUs for the purpose of crypto mining and data center.
Daven: It's really going to be up to us as consumers to reject these high power, expensive, obscenely high margin parts.

Our world will literally die the moment humans turn on millions of 1000W GPUs in order just to see better virtual water and light reflections in games and CGI films.
well said. ain't happening though, people will buy 500W 5090s like crazy just to have the best. 4K RT on isn't enough apparently.
#6
Klemc
Zazigalka: If there is one thing that I despise that company for, it's not the proprietary stuff that made its way into gaming, cause that's what usually drives the competition to develop the simpler but open version of it. It's the introduction of those high power GPUs for the purpose of crypto mining and data center.
If this uses RTX cores (tensor thingy) then it's getting worse and worse
#7
Philaphlous
how the heck do they think you can get 1000W+ on 12V rails.... they're obviously going to have to scale up the voltage. I'm already concerned how they're able to pump that many watts into a GPU... imagine the amps from those MOSFETs. So what if it's like 1.4V, but you're talking like 300-400 amps+. 2700W is nuts... that's 225 amps @ 12V. Do they realize you'd need like 0/1 gauge wire for the GPU... something seems fishy here...
#8
Zazigalka
Klemc: If this uses RTX cores (tensor thingy) then it's getting worse and worse
I think tensor/RT cores are actually doing more good than bad, being able to accelerate very specific workloads at a fraction of the power that CUDA cores would require. The problem is nvidia going to downright stupid lengths in order to provide the best datacenter GPUs regardless of the crazy amount of power they require. I can't quote the exact number or article, but I remember reading something that said keeping ChatGPT and similar AI services operational requires more power daily than a few moderately sized countries need.
#9
fevgatos
Zazigalka: If there is one thing that I despise that company for, it's not the proprietary stuff that made its way into gaming, cause that's what usually drives the competition to develop the simpler but open version of it. It's the introduction of those high power GPUs for the purpose of crypto mining and data center.

well said. ain't happening though, people will buy 500W 5090s like crazy just to have the best. 4K RT on isn't enough apparently.
Just limit it? At some point we need to realize the 5090, just like the 4090, is the most efficient chip, thus the best at "protecting the environment".

Of course people will buy 5090s like crazy. Why wouldn't they? As I've said, even people that care about efficiency and power draw will buy 5090s cause they are the fastest at any power level.
#10
phints
Eyeing that GB200 with 384 GB of VRAM and 2.7 kW of power for PUBG... just need an industrial 480 VAC feeder to power it.
#11
TheinsanegamerN
Daven: It's really going to be up to us as consumers to reject these high power, expensive, obscenely high margin parts.
If you don't want it, don't buy it? There's clearly plenty of us willing to pay for powerful GPUs. That's ALWAYS been the case. And people have whinged and moaned about high power expensive GPUs since the dawn of PCIe.
Daven: Our world will literally die the moment humans turn on millions of 1000W GPUs in order just to see better virtual water and light reflections in games and CGI films.
Klemc: The global climate is getting hotter and hotter...

It's time for PCs to also participate in this phenomenon !
It's time you DID something about it! Have you abandoned your PCs and returned to an agrarian society? Oh, no, you have not. Better get on it, you got a planet to save!
#12
Jonny5isalivetm5
Pretty sure the 2700 W monster will not be for consumer PCs, but let's see XD
#13
TheinsanegamerN
Philaphlous: how the heck do they think you can get 1000W+ on 12V rails.... they're obviously going to have to scale up the voltage. I'm already concerned how they're able to pump that many watts into a GPU... imagine the amps from those MOSFETs. So what if it's like 1.4V, but you're talking like 300-400 amps+. 2700W is nuts... that's 225 amps @ 12V. Do they realize you'd need like 0/1 gauge wire for the GPU... something seems fishy here...
They're not pushing 225 amps on one 12V line. Have you ever noticed that connectors have multiple wires? There are, in fact, SIX 12V lines on the ATX 3.0 connector. SIX. So divide that 225 amps by 6 and you get 37.5 per line. But wait, there's more!

A 1000W GPU is going to use 2 ATX 3.0 connectors. Split that power among the 12 12V lines, and that's just 6.94 amps per line. That's right around the power level of the old 8-pin connectors, per line.

400 amps at 1.4V is only 560 watts. We've had GPUs push more than that already. That 2700W card is going to be server only, likely using non-standard connectors to push that kind of power. It's not unusual either; high power cards have existed for a LONG time. Wouldn't surprise me if they went with 24V for those.
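For readers following the numbers in this exchange, here is the same arithmetic written out; the six current-carrying 12 V wires per connector, the two-connector split, and the 2,700 W figure are the commenters' assumptions, not confirmed specifications.

```python
# Per-wire current arithmetic from the exchange above. Wire counts, voltages,
# and wattages are the commenters' assumptions, not confirmed specs.

def amps_per_wire(watts: float, volts: float, wires: int) -> float:
    """Current per wire when a load is split evenly across parallel wires."""
    return watts / volts / wires

print(f"2700W on one 6-wire 12V connector: {amps_per_wire(2700, 12, 6):.1f} A/wire")
print(f"1000W on two connectors (12 wires): {amps_per_wire(1000, 12, 12):.2f} A/wire")
print(f"400 A at 1.4 V: {400 * 1.4:.0f} W")
```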
#14
Klemc
TheinsanegamerN: It's time you DID something about it! Have you abandoned your PCs and returned to an agrarian society? Oh, no, you have not. Better get on it, you got a planet to save!
I left France from 2010 to 2020... but living in Madagascar didn't make me any less of a polluter, and your health goes downhill fast as hell there too (didn't you know?).

Also, my late uncle was one of the main founders of the ecologist (green politics) movement in France.
fr.wikipedia.org/wiki/Bernard_Charbonneau
#15
Zazigalka
TheinsanegamerN: Have you abandoned your PCs and returned to an agrarian society? Oh, no, you have not. Better get on it, you got a planet to save!
No, but I've made the transition to using a zen4 laptop (7730U+Vega) as my main daily work/news/video PC, and I switch the desktop on just for gaming sessions.
I also swapped the 3080 I had for a 4070 Super just to cut desktop/gaming power draw.
#16
Daven
TheinsanegamerN: It's time you DID something about it! Have you abandoned your PCs and returned to an agrarian society? Oh, no, you have not. Better get on it, you got a planet to save!
Since you don't know me: I'm 48 with no children, vegetarian for 20 years, have walked to work for the last 15 years, my wife and I have mostly lived within 5 miles of work, we own one small sedan that sits in the driveway, I've never owned a truck, and I've built mostly SFF PCs with 65W CPUs and 150W GPUs.

My carbon footprint is extremely small for an American and I sleep fine at night. So I can easily say we need to reject such high power computer parts as much as we can. Oh, and I was a political activist in my youth, fighting for the environment.
#17
londiste
Zazigalka: The problem is nvidia going to downright stupid lengths in order to provide the best datacenter GPUs regardless of the crazy amount of power they require.
It is pretty simple with DCs - efficiency is king. If that 2700W thing does more work per power unit than competitors, it gets used.
#18
SIGSEGV
No problem, and nothing to worry about.
Your believers will still want to buy even if your products draw 10 kilowatts.
#19
TheDeeGee
bonehead123: According to Ngreediya, this is already happening, and will only serve to increase prices on cooling parts overall, which will make them happy as this will give them yet ANUTHA reason to jack up GPU prices :(
Who is Ngreediya?

I only know AlwaysMuchDaft.
#20
fevgatos
londiste: It is pretty simple with DCs - efficiency is king. If that 2700W thing does more work per power unit than competitors, it gets used.
Yeah, it's sad that on a tech forum people don't understand the difference between power draw and efficiency. They see nvidia in the title and go berserk.
#21
napata
Daven: It's really going to be up to us as consumers to reject these high power, expensive, obscenely high margin parts.

Our world will literally die the moment humans turn on millions of 1000W GPUs in order just to see better virtual water and light reflections in games and CGI films.
This is all B2B, so it's not like consumers can do anything about it. These get sold by the rack for millions of dollars. It's even in the article how a single NVL72 rack draws 140 kW, and these most likely run 24/7, unlike consumer GPUs. That's significantly more power than I use for everything combined, and I own a 4090.
#22
evernessince
The sooner faster ASICs come to the market for AI, the better. This kind of power consumption is nowhere near where it needs to be. The human brain uses a mere 20 W of power and crunches a theoretical exaflop of data per second, according to NIST.
TheinsanegamerN: It's time you DID something about it! Have you abandoned your PCs and returned to an agrarian society? Oh, no, you have not. Better get on it, you got a planet to save!
It would be counter-intuitive to return to an agrarian society given that single persons / families farming their own food would be very inefficient, especially compared to today's farms that use GPS and precision systems to vastly improve efficiency and reduce waste. Stopping climate change isn't about halting all carbon output; it's going to be a combination of efficiency improvements, green energy, and carbon capture. Most people will not have to make significant changes to their lifestyle; it will be a matter of more efficient cars, hot water tanks, etc., in combination with carbon capture tech. This is why people call on business to make changes, as they have a lot of carbon output from their activities and the products they produce have a downstream impact on customers as well.

There are no purity requirements for lodging a pro-environment argument, or any argument for that matter. This is just a typical logical fallacy employed to discount a person's opinion without actually providing any substance against their argument.
#23
R0H1T
evernessince: The human brain uses a mere 20 W of power and crunches a theoretical exaflop of data per second, according to NIST.
We should probably start growing more brains, as they say win-win :laugh:
#24
MaMoo
evernessince: The sooner faster ASICs come to the market for AI, the better. This kind of power consumption is nowhere near where it needs to be. The human brain uses a mere 20 W of power and crunches a theoretical exaflop of data per second, according to NIST.

It would be counter-intuitive to return to an agrarian society given that single persons / families farming their own food would be very inefficient, especially compared to today's farms that use GPS and precision systems to vastly improve efficiency and reduce waste. Stopping climate change isn't about halting all carbon output; it's going to be a combination of efficiency improvements, green energy, and carbon capture. Most people will not have to make significant changes to their lifestyle; it will be a matter of more efficient cars, hot water tanks, etc., in combination with carbon capture tech. This is why people call on business to make changes, as they have a lot of carbon output from their activities and the products they produce have a downstream impact on customers as well.

There are no purity requirements for lodging a pro-environment argument, or any argument for that matter. This is just a typical logical fallacy employed to discount a person's opinion without actually providing any substance against their argument.
Good points. I will just point out that, regardless of ASICs, the current generation of AI algorithms is very inefficient compared to human learning. We generalize and extrapolate far more efficiently; for example, we don't need so much training data to identify animals, and we can combine both reductive analysis and systems methods to make inferences. I think the current generation of AI, typical of the third renaissance of AI, is going to need a few more renaissances to become more efficient, ASICs aside.
#25
Klemc
Why the name "Blackwell"? Any source?