Monday, July 3rd 2023

Inflection AI Builds Supercomputer with 22,000 NVIDIA H100 GPUs

The AI hype continues to drive hardware shipments, especially of GPU servers, which are in very high demand. The latest example comes from AI startup Inflection AI. The company, which builds foundational AI models, has secured an order of 22,000 NVIDIA H100 GPUs and is building a supercomputer with them. Assuming a configuration of a single Intel Xeon CPU per eight GPUs, almost 700 four-node racks should go into the supercomputer. Scaling and connecting 22,000 GPUs is easier than acquiring them, as NVIDIA's H100 GPUs are selling out everywhere due to the enormous demand for AI applications both on and off premises.
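The rack estimate above can be sanity-checked with quick arithmetic, using the article's own assumptions of eight GPUs per node and four nodes per rack:

```python
# Back-of-envelope check of the rack count quoted in the article.
# Assumptions (from the article): 8 GPUs per node, 4 nodes per rack.
TOTAL_GPUS = 22_000
GPUS_PER_NODE = 8
NODES_PER_RACK = 4

nodes = TOTAL_GPUS / GPUS_PER_NODE   # 2,750 nodes
racks = nodes / NODES_PER_RACK       # 687.5 racks, i.e. "almost 700"
print(f"{nodes:.0f} nodes, {racks:.1f} racks")
```

At 687.5 racks, rounding up to whole racks gives 688, consistent with the article's "almost 700" figure.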

Getting 22,000 H100 GPUs is the biggest challenge here, and Inflection AI managed it in part by having NVIDIA as an investor in the startup. The supercomputer is estimated to cost around one billion USD and consume 31 megawatts of power. At the time of writing, Inflection AI is valued at 1.5 billion USD.
Source: HardwareLuxx.de

12 Comments on Inflection AI Builds Supercomputer with 22,000 NVIDIA H100 GPUs

#1
P4-630
AleksandarK: Getting 22,000 H100 GPUs is the biggest challenge here
And building it I guess...
Posted on Reply
#2
Paganstomp
Just when we are getting over a GPU shortage...
Posted on Reply
#3
lemonadesoda
I don't understand the scaling assumption of 1:8.

If a retail board like this MSI B360-F PRO can do 18 GPUs, then a specialist bespoke Xeon board could easily do many more. After all, the Intel Xeon W-3400 series has 112 PCIe lanes and could therefore run 112 GPUs - let's call it 100.
Posted on Reply
#4
Wirko
lemonadesoda: I don't understand the scaling assumption of 1:8.

If a retail board like this MSI B360-F PRO can do 18 GPUs, then a specialist bespoke Xeon board could easily do many more. After all, the Intel Xeon W-3400 series has 112 PCIe lanes and could therefore run 112 GPUs - let's call it 100.
"Can do"? For mining, sure, with an i3 CPU at that.
Here there are huge amounts of data to move to and from storage, and part of the processing also takes place on the CPUs. Monster computing nodes with 8 GPU accelerators and twin Xeons or Epycs aren't uncommon. One variant of the MI300 is going to have as many as 24 CPU cores in the same package as the GPU, which will enable operation without a separate Epyc - and think about how much bandwidth those CPU cores need to communicate with the GPU part.
Posted on Reply
#5
Scrizz
lol I read that as "Infection AI." :laugh:
Posted on Reply
#6
dragontamer5788
The Inflection AI startup is now valued at 1.5 billion USD at the time of writing.
Assuming $10,000 per GPU, that's $220 Million on GPUs alone, let alone datacenter costs, CPU costs, RAM, hard drives...

A valuation of $1.5 Billion sounds fair because that's barely more than the underlying hardware.
Posted on Reply
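The estimate in the comment above is easy to reproduce. Note that the $10,000-per-GPU figure is the commenter's assumption; a later comment in this thread puts the street price closer to $40,000:

```python
# Reproducing the commenter's back-of-envelope hardware cost.
TOTAL_GPUS = 22_000
PRICE_LOW = 10_000    # commenter's assumed price per H100
PRICE_HIGH = 40_000   # price claimed in a later comment

cost_low = TOTAL_GPUS * PRICE_LOW    # $220 million
cost_high = TOTAL_GPUS * PRICE_HIGH  # $880 million, close to the ~$1B system estimate
```

At the higher per-unit price, the GPUs alone account for most of the quoted one-billion-USD build cost.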
#8
Wirko
Minus Infinity: So AI has actually achieved an Inflection point.
Maybe we're lucky to have a limited amount of sand and electricity to produce chips, and of course a limited number of TSMCs who can print them.
Posted on Reply
#9
dragontamer5788
Minus Infinity: So AI has actually achieved an Inflection point.
An inflection point of venture capitalist money for sure.

For creative use: AI seems like it's here to stay, with things like Photoshop's Generative Fill (www.adobe.com/products/photoshop/generative-fill.html). I'm not convinced text is quite ready yet, even with GPT-4. ChatGPT / GPT-4 is good enough to make very annoying spambots, but the hallucinations and fabricated content are just awful and make practical use of GPT-4 unworkable in many cases.
Posted on Reply
#10
skates
31 megawatts of juice required - that is an enormous amount of power, and to what end? I wonder if adding this demand ups the cost of residential power.
Posted on Reply
#11
diatribe
dragontamer5788: Assuming $10,000 per GPU, that's $220 Million on GPUs alone, let alone datacenter costs, CPU costs, RAM, hard drives...

A valuation of $1.5 Billion sounds fair because that's barely more than the underlying hardware.
The H100s are going for $40,000 each!
Posted on Reply
#12
P4-630
diatribe: The H100s are going for $40,000 each!
But: The more you buy, the more you save!.... :D
Posted on Reply