Friday, October 11th 2024

NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities

NVIDIA's "Blackwell" series of GPUs, including B100, B200, and GB200, are reportedly sold out for 12 months or an entire year. This directly means that if a new customer is willing to order a new Blackwell GPU now, there is a 12-month waitlist to get that GPU. Analyst from Morgan Stanley Joe Moore confirmed that in a meeting with NVIDIA and its investors, NVIDIA executives confirmed that the demand for "Blackwell" is so great that there is a 12-month backlog to fulfill first before shipping to anyone else. We expect that this includes customers like Amazon, META, Microsoft, Google, Oracle, and others, who are ordering GPUs in insane quantities to keep up with the demand from their customers.

The previous-generation "Hopper" GPUs were ordered in the tens of thousands, while this "Blackwell" generation is being ordered in the hundreds of thousands at a time. For NVIDIA, that is excellent news, as the demand is expected to continue. The only thing standing between customers and their GPUs is TSMC, which is manufacturing them as fast as possible to meet demand. NVIDIA is one of TSMC's largest customers, so its wafer allocation at TSMC's facilities is only expected to grow. We are now officially in the era of million-GPU data centers, and we can only wonder at what point this massive growth stops, or whether it will stop at all in the near future.
Source: via Barron's (Paywall)

22 Comments on NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities

#1
Daven
This leaves AMD Instinct for the littler guys to develop the next generation of AI and GPU compute. No one is going to stop developing and buying just because Nvidia is sold out.
#2
Bet0n
Meanwhile: www.latent.space/p/gpu-bubble

"$2 H100s: How the GPU Bubble Burst

Last year, H100s were $8/hr if you could get them. Today, there's 7 different resale markets selling them under $2. What happened?"

#3
Dr. Dro
Bet0nMeanwhile: www.latent.space/p/gpu-bubble

"$2 H100s: How the GPU Bubble Burst

Last year, H100s were $8/hr if you could get them. Today, there's 7 different resale markets selling them under $2. What happened?"

It's possible that H100s are now being retired because much faster hardware is available. The Instinct MI300X is known to be better (as per Chips and Cheese testing), and the Blackwells are indubitably much faster as well.

While hardware like the H100 may seem surreal to most of us, there is only one other industry as greedy and as demanding of the absolute latest, no matter the cost, as the AI industry: the crypto industry. Both are buying these by the crate, since time is quite literally money.
#4
bonehead123
Wow, I'm soooo glad that I got my teeny, weenie, AI-buyer-bot-generated order of 10K units in before the rush from the big guys....so yee haw to me and boo hoo to all the suckas that waited :)

And btw, they are scheduled for delivery late next week, so I'm gonna be a real busy guy for the next few years doing upgrades for all my previous clients !




j/k.....hahahaha, made ya look :D
#5
bug
Goodbye mining, hello AI...
#6
OSdevr
If the bubble bursts within the next year (quite likely) I wonder how many of these orders will be cancelled.
#7
64K
Looks like another scalper's paradise incoming.
#8
john_
Blackwell is two GPUs glued together, so it's like a 50% discount compared to Hopper, plus all the extra features. It makes sense that it would see much higher market success.
#9
Wirko
bonehead123Wow, I'm soooo glad that I got my teeny, weenie, AI-buyer-bot-generated order of 10K units in before the rush from the big guys....so yee haw to me and boo hoo to all the suckas that waited :)

And btw, they are scheduled for delivery late next week, so I'm gonna be a real busy guy for the next few years doing upgrades for all my previous clients !




j/k.....hahahaha, made ya look :D
8.5% chance your bots were hallucinating and bought some IBM mainframes because those looked big enough in the pics, and prices seemed about right.
#10
bonehead123
Wirko8.5% chance your bots were hallucinating and bought some IBM mainframes because those looked big enough in the pics, and prices seemed about right.
No, I put it at only 2.02379% chance, hahahaha :)

and FYI...mainframes are just slameframes in disguise, IMO...
#11
watzupken
This is not surprising. Nvidia does not have an infinite supply of AI hardware, so just fulfilling orders that are pent up because companies are waiting for Blackwell instead of the existing H100 will give you the impression that the orders are "insane," in the words of Jensen. It is all about building hype. The stock market is generally irrational and driven mostly by hype and investors' "gut feel".
#12
Space Lynx
Astronaut
Is this the same node/factories as the Blackwell gaming GPUs, or are those different factories/nodes? If it's the same, then RIP gamers trying to get one. I'm not in the market, happy with my 7900 XT, but yeah, damn.
#13
hsew
DavenThis leaves AMD Instinct for the littler guys to develop the next generation of AI and GPU compute. No one is going to stop developing and buying just because Nvidia is sold out.
Right now it looks like the next race is going to be building L-Mul accelerators.
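For context on what an L-Mul accelerator would compute: the linear-complexity multiplication idea is to approximate a floating-point multiply using only additions on the exponent and mantissa fields, skipping the costly mantissa-by-mantissa product. Below is a rough Python sketch of that concept, purely for illustration; the offset rule and bit handling are my assumptions, not the actual paper's algorithm or any hardware implementation.

```python
import math

def lmul_approx(x: float, y: float, mantissa_bits: int = 8) -> float:
    """Rough sketch of the L-Mul idea: approximate x * y using additions only.

    For x = (1 + xm) * 2**xe and y = (1 + ym) * 2**ye, the exact product is
    (1 + xm + ym + xm * ym) * 2**(xe + ye). L-Mul drops the xm * ym term and
    substitutes a small constant offset; the offset rule below is an assumption
    for illustration, not a hardware-accurate implementation.
    """
    if x == 0.0 or y == 0.0:
        return 0.0
    sign = math.copysign(1.0, x) * math.copysign(1.0, y)
    # frexp gives abs(v) = f * 2**e with f in [0.5, 1); convert to (1 + m) * 2**e form.
    xf, xe = math.frexp(abs(x))
    yf, ye = math.frexp(abs(y))
    xm, xe = 2 * xf - 1, xe - 1
    ym, ye = 2 * yf - 1, ye - 1
    offset = 2.0 ** -(4 if mantissa_bits > 4 else mantissa_bits)
    return sign * (1 + xm + ym + offset) * 2.0 ** (xe + ye)

# Compare the addition-only approximation with the exact product.
print(lmul_approx(3.7, -2.1), 3.7 * -2.1)
```

The appeal is that additions are far cheaper in silicon and energy than floating-point multiplies, which is why dedicated accelerators for this kind of operation are being talked about.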
#14
Dr. Dro
Space LynxIs this the same node/factories as the Blackwell gaming GPUs, or are those different factories/nodes? If it's the same, then RIP gamers trying to get one. I'm not in the market, happy with my 7900 XT, but yeah, damn.
Yes, it's the same node. However, this shouldn't have much of an impact on the gaming business. They don't use the same processor, and TSMC is pumping out wafers at maximum capacity already.

Supply is entirely constrained by TSMC's production capacity nowadays. They have a de facto monopoly on latest-generation semiconductor fabrication.
#15
Space Lynx
Astronaut
Dr. DroYes, it's the same node. However, this shouldn't have much of an impact on the gaming business. They don't use the same processor, and TSMC is pumping out wafers at maximum capacity already.

Supply is entirely constrained by TSMC's production capacity nowadays. They have a de facto monopoly on latest-generation semiconductor fabrication.
I just don't see why the factory would spend any time at all making 5090s when they could be pumping out the AI chips. Maybe the gaming market is just so small at the 5090 price level that they can spend less than 10% of factory time over a year to meet most demand? No idea. Or maybe some AI chips come out bad and they can repurpose them into a 5080 or 5090, sort of like my 7900 XT was probably just failed XTX silicon?
#16
Dr. Dro
Space LynxI just don't see why the factory would spend any time at all making 5090s when they could be pumping out the AI chips. Maybe the gaming market is just so small at the 5090 price level that they can spend less than 10% of factory time over a year to meet most demand? No idea. Or maybe some AI chips come out bad and they can repurpose them into a 5080 or 5090, sort of like my 7900 XT was probably just failed XTX silicon?
Margins on the 5090 are quite high, with a volume that will far exceed enterprise deals. I'd wager they'd scale down on the more "accessible" chips instead, and make the bulk of the shipment 5090s and a few 5080s. The 5090 being 13% disabled certainly means it's "failed" GB202 silicon, which will likely be seen in full in enterprise RTX (same as the 4090 and RTX 6000 Ada Generation).
#17
Solaris17
Super Dainty Moderator
Dr. DroIt's possible that H100s are now being retired because much faster hardware is available.
Yes. These systems will get relegated to training or smaller clusters. Most of the top-end GPU buys are for inference, which is what users see when they ask ChatGPT something, generate a cat meme, have their video scrubbed, etc. The real-time stuff that leaves the user waiting is inference, and that is what these companies are replacing.

They will keep doing the "buy new" with every generation. The old GPUs still exist; they aren't going to landfills. They just keep them in the same rack and repurpose the workload. Training can happen a little slower, so when a system is no longer meeting the performance metrics required for inference, they shift the load and have the old ones train the models instead.

This happens industry-wide; otherwise you would have just as many trucks leaving with equipment as coming in with new, which isn't the case. That's why all of them are expanding their DC presence; square footage is the real race. Since there needs to be both training and inference, they aren't getting rid of the older systems. They aren't even moving them; they just open a new data hall.

You won't see actual sunsetting of these GPUs until 3-5 years after release. The H100 is certainly still being used.
#18
igormp
Solaris17Yes. These systems will get relegated to training or smaller clusters. Most of the top-end GPU buys are for inference, which is what users see when they ask ChatGPT something, generate a cat meme, have their video scrubbed, etc. The real-time stuff that leaves the user waiting is inference, and that is what these companies are replacing.

They will keep doing the "buy new" with every generation. The old GPUs still exist; they aren't going to landfills. They just keep them in the same rack and repurpose the workload. Training can happen a little slower, so when a system is no longer meeting the performance metrics required for inference, they shift the load and have the old ones train the models instead.

This happens industry-wide; otherwise you would have just as many trucks leaving with equipment as coming in with new, which isn't the case. That's why all of them are expanding their DC presence; square footage is the real race. Since there needs to be both training and inference, they aren't getting rid of the older systems. They aren't even moving them; they just open a new data hall.

You won't see actual sunsetting of these GPUs until 3-5 years after release. The H100 is certainly still being used.
I miss when it took only 1~2 gens for used products to flood the market for cheap. When the A100 80GB came out you could get V100s for really cheap, but with the AI craze you can barely see even A100s, and those are hella expensive still :(
#19
umeng2002
Is anyone making money on AI yet? I mean other than nVidia™ and TSMC.
#20
R0H1T
Samsung's supposed to charge for parts of their AI suite after the end of next year.
#21
Dr. Dro
igormpI miss when it took only 1~2 gens for used products to flood the market for cheap. When the A100 80GB came out you could get V100s for really cheap, but with the AI craze you can barely see even A100s, and those are hella expensive still :(
Even going as far back as GV100, it's still way out of reach for the hobbyist. I've wanted a Titan V for so long, and only now have they hit the $400 mark out there. Here in Brazil, then, good luck...
umeng2002Is anyone making money on AI yet? I mean other than nVidia™ and TSMC.
Oh yes. It's that big an industry.