Friday, October 11th 2024
NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities
NVIDIA's "Blackwell" series of GPUs, including the B100, B200, and GB200, are reportedly sold out for the next 12 months. This means that a new customer ordering a Blackwell GPU today faces a 12-month waitlist. Morgan Stanley analyst Joe Moore confirmed that in a meeting with NVIDIA and its investors, NVIDIA executives stated that demand for "Blackwell" is so great that there is a 12-month backlog to clear before shipping to anyone else. We expect this backlog includes customers like Amazon, Meta, Microsoft, Google, and Oracle, who are ordering GPUs in enormous quantities to keep up with demand from their own customers.
The previous-generation "Hopper" GPUs were ordered in the tens of thousands, while this "Blackwell" generation is being ordered in the hundreds of thousands at a time. For NVIDIA, that is excellent news, as the demand is expected to continue. The main constraint now is TSMC, which is manufacturing these GPUs as fast as possible to meet demand. NVIDIA is one of TSMC's largest customers, so NVIDIA's wafer allocation at TSMC's facilities is only expected to grow. We are now officially in the era of million-GPU data centers, and we can only wonder at what point this massive growth will slow, or whether it will slow at all in the near future.
Source:
via Barron's (Paywall)
29 Comments on NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities
"$2 H100s: How the GPU Bubble Burst
Last year, H100s were $8/hr if you could get them. Today, there's 7 different resale markets selling them under $2. What happened?"
While hardware like the H100 may seem surreal to most of us, there is only one industry as greedy for the absolute latest, no matter the cost, as the AI industry: the crypto industry. Both are buying these by the crate, since time is quite literally money.
And btw, they are scheduled for delivery late next week, so I'm gonna be a real busy guy for the next few years doing upgrades for all my previous clients !
j/k.....hahahaha, made ya look :D
and FYI...mainframes are just slameframes in disguise, IMO...
Supply is entirely constrained by TSMC's production capacity these days. They have a de facto monopoly on latest-generation semiconductor fabrication.
They will keep doing the "buy new" with every generation. The old GPUs still exist; they aren't going to landfills. They just keep them in the same rack and repurpose the workload. Training can happen a little slower, so when a system is no longer meeting the performance metrics required for inference, they shift the load and have the old ones train the models instead.
This happens industry-wide; otherwise you would see just as many trucks leaving with equipment as arriving with new gear. That isn't the case, which is why all of them are expanding their DC presence. Square footage is the real race. Since there needs to be both training and inference, they aren't getting rid of the older systems; they aren't even moving them, they just open a new data hall.
You won't see actual sunsetting until 3-5 years after release for these GPUs. The H100 is certainly still being used.
Remember that an AI accelerator is not just one chip all by itself. There's the PCB and a bunch of other things on the board. Then the board needs to plug into something. It won't get data magically so it needs connectivity infrastructure. There are companies who make racks, power supplies, cooling systems, etc.
And data needs to be stored somewhere so all of the companies involved in solid state data storage will be making money: SK Hynix, Marvell, Samsung, etc.
Of course all of the chip design & process companies are enjoying the AI boom. Cadence, ASML, Lam Research, Applied Materials, Zeiss, KLA, etc.
It's not just Nvidia and their manufacturing partner. Naturally AMD and Intel are also making money from AI, just not as much as Nvidia.
Of course, the power company is making money too. And since the world has more than one power company, you can use the words "power companies." Plural.
Nvidia most certainly is not doing this by itself.
Remember that AI really isn't a product. There are consumer focused AI-powered tools but most AI right now is being used in enterprise settings. There are companies like FedEx and Walmart who are using AI to remove bottlenecks in typical situations. Even in chip design, AI is being used for chip layout. What took a year manually by humans can now be done in a month or less using AI-powered chip layout tools.
Lots of consumers online are fixated on consumer-facing AI solutions, but I assure you that AI chatbots are a tiny fraction of what AI is already doing. From a consumer perspective, remember that kids (teens and people in their 20s) have been using AI chatbots for a couple of years to do homework, write term papers, etc.