Friday, October 11th 2024

NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities

NVIDIA's "Blackwell" series of GPUs, including B100, B200, and GB200, are reportedly sold out for 12 months or an entire year. This directly means that if a new customer is willing to order a new Blackwell GPU now, there is a 12-month waitlist to get that GPU. Analyst from Morgan Stanley Joe Moore confirmed that in a meeting with NVIDIA and its investors, NVIDIA executives confirmed that the demand for "Blackwell" is so great that there is a 12-month backlog to fulfill first before shipping to anyone else. We expect that this includes customers like Amazon, META, Microsoft, Google, Oracle, and others, who are ordering GPUs in insane quantities to keep up with the demand from their customers.

The previous-generation "Hopper" GPUs were ordered in quantities of tens of thousands, while "Blackwell" is being ordered in quantities of hundreds of thousands at a time. For NVIDIA, that is excellent news, as that demand is expected to continue. The only thing standing between customers and their GPUs is TSMC, which is manufacturing them as fast as possible to meet demand. NVIDIA is one of TSMC's largest customers, so its wafer allocation at TSMC's facilities is only expected to grow. We are now officially in the era of million-GPU data centers, and we can only wonder at what point this massive growth stops, or whether it will stop at all in the near future.
Source: via Barron's (Paywall)

29 Comments on NVIDIA "Blackwell" GPUs are Sold Out for 12 Months, Customers Ordering in 100K GPU Quantities

#1
Daven
This leaves AMD Instinct for the littler guys to develop the next generation of AI and GPU compute. No one is going to stop developing and buying just because Nvidia is sold out.
#2
Bet0n
Meanwhile: www.latent.space/p/gpu-bubble

"$2 H100s: How the GPU Bubble Burst

Last year, H100s were $8/hr if you could get them. Today, there's 7 different resale markets selling them under $2. What happened?"

#3
Dr. Dro
Bet0n: Meanwhile: www.latent.space/p/gpu-bubble

"$2 H100s: How the GPU Bubble Burst

Last year, H100s were $8/hr if you could get them. Today, there's 7 different resale markets selling them under $2. What happened?"

It's possible that H100s are now being retired as much faster hardware becomes available. The Instinct MI300X is known to be better (as per Chips and Cheese testing), and the Blackwells are indubitably much faster as well.

While hardware like the H100 may seem surreal to most of us, only one industry is as greedy and as demanding of the absolute latest, no matter the cost, as the AI industry: crypto. Both are buying these by the crate, since time is quite literally money.
#4
bonehead123
Wow, I'm soooo glad that I got my teeny, weenie, AI-buyer-bot-generated order of 10K units in before the rush from the big guys....so yee haw to me and boo hoo to all the suckas that waited :)

And btw, they are scheduled for delivery late next week, so I'm gonna be a real busy guy for the next few years doing upgrades for all my previous clients !




j/k.....hahahaha, made ya look :D
#5
bug
Goodbye mining, hello AI...
#6
OSdevr
If the bubble bursts within the next year (quite likely), I wonder how many of these orders will be cancelled.
#7
64K
Looks like another scalper's paradise incoming.
#8
john_
Blackwell is two GPUs glued together, so it's like a 50% discount compared to Hopper, plus all the extra features. It makes sense that it would see much higher market success.
#9
Wirko
bonehead123: Wow, I'm soooo glad that I got my teeny, weenie, AI-buyer-bot-generated order of 10K units in before the rush from the big guys....so yee haw to me and boo hoo to all the suckas that waited :)

And btw, they are scheduled for delivery late next week, so I'm gonna be a real busy guy for the next few years doing upgrades for all my previous clients !




j/k.....hahahaha, made ya look :D
8.5% chance your bots were hallucinating and bought some IBM mainframes because those looked big enough in the pics, and prices seemed about right.
#10
bonehead123
Wirko: 8.5% chance your bots were hallucinating and bought some IBM mainframes because those looked big enough in the pics, and prices seemed about right.
No, I put it at only 2.02379% chance, hahahaha :)

and FYI...mainframes are just slameframes in disguise, IMO...
#11
watzupken
This is not surprising. Nvidia does not have an infinite supply of AI hardware, so just fulfilling orders that were pent up because companies were waiting for Blackwell instead of buying existing H100s will give the impression that demand is "insane," in the words of Jensen. It is all about building hype. The stock market is generally irrational and driven mostly by hype and investors' "gut feel".
#12
Space Lynx
Astronaut
Is this the same node/factories as the Blackwell gaming GPU, or are those different factories/node? If it's the same, then RIP gamers trying to get one. I'm not in the market, happy with my 7900 XT, but yeah, damn.
#13
hsew
Daven: This leaves AMD Instinct for the littler guys to develop the next generation of AI and GPU compute. No one is going to stop developing and buying just because Nvidia is sold out.
Right now it looks like the next race is going to be building L-Mul accelerators.
#14
Dr. Dro
Space Lynx: Is this the same node/factories as the Blackwell gaming GPU, or are those different factories/node? If it's the same, then RIP gamers trying to get one. I'm not in the market, happy with my 7900 XT, but yeah, damn.
Yes, it's the same node. However, this shouldn't have much of an impact on the gaming business. They don't use the same processor, and TSMC is pumping out wafers at maximum capacity already.

Supply is entirely constrained by TSMC production capacity nowadays. They have a de facto monopoly on latest-generation semiconductor fabrication.
#15
Space Lynx
Astronaut
Dr. Dro: Yes, it's the same node. However, this shouldn't have much of an impact on the gaming business. They don't use the same processor, and TSMC is pumping out wafers at maximum capacity already.

Supply is entirely constrained by TSMC production capacity nowadays. They have a de facto monopoly on latest-generation semiconductor fabrication.
I just don't see why the factory would spend any time at all making 5090s when it could be pumping out AI chips. Maybe the gaming market at the 5090 price level is so small that less than 10% of factory time for a year would cover most of the demand? No idea. Or maybe some AI chips come out defective and can be repurposed into a 5080 or 5090, sort of like how my 7900 XT was probably just failed XTX silicon?
#16
Dr. Dro
Space Lynx: I just don't see why the factory would spend any time at all making 5090s when it could be pumping out AI chips. Maybe the gaming market at the 5090 price level is so small that less than 10% of factory time for a year would cover most of the demand? No idea. Or maybe some AI chips come out defective and can be repurposed into a 5080 or 5090, sort of like how my 7900 XT was probably just failed XTX silicon?
Margins on the 5090 are quite high, with a volume that will far exceed enterprise deals. I'd wager they'd scale down on the more "accessible" chips instead and make the bulk of the shipment 5090s and a few 5080s. The 5090 being 13% disabled certainly means it's "failed" GB202 silicon; the full chip will likely show up in enterprise RTX (same as the 4090 and the RTX 6000 Ada Generation).
#17
Solaris17
Super Dainty Moderator
Dr. Dro: It's possible that H100s are now being retired as much faster hardware becomes available.
Yes. These systems will get relegated to training or smaller clusters. Most of the top-end GPU buys are for inference, which is what users see when they ask ChatGPT something, generate a cat meme, or have their video scrubbed, etc. The real-time stuff that leaves the user waiting is inference, and that is what these companies are replacing.

They will keep doing the "buy new" with every generation. The old GPUs still exist; they aren't going to landfills. They just keep them in the same rack and repurpose the workload. Training can happen a little slower, so when a system is no longer meeting the performance metrics required for inference, they shift the load and have the old ones train the models instead.

This happens industry-wide; otherwise you would have just as many trucks leaving with equipment as coming in with new, which isn't the case. That's why all of them are expanding their DC presence. Square footage is the real race. Since there needs to be both training and inference, they aren't getting rid of the older systems; they aren't even moving them, they just open a new data hall.

You won't see actual sunsetting until 3-5 years after release for these GPUs. The H100 is certainly still being used.
#18
igormp
Solaris17: Yes. These systems will get relegated to training or smaller clusters. Most of the top-end GPU buys are for inference, which is what users see when they ask ChatGPT something, generate a cat meme, or have their video scrubbed, etc. The real-time stuff that leaves the user waiting is inference, and that is what these companies are replacing.

They will keep doing the "buy new" with every generation. The old GPUs still exist; they aren't going to landfills. They just keep them in the same rack and repurpose the workload. Training can happen a little slower, so when a system is no longer meeting the performance metrics required for inference, they shift the load and have the old ones train the models instead.

This happens industry-wide; otherwise you would have just as many trucks leaving with equipment as coming in with new, which isn't the case. That's why all of them are expanding their DC presence. Square footage is the real race. Since there needs to be both training and inference, they aren't getting rid of the older systems; they aren't even moving them, they just open a new data hall.

You won't see actual sunsetting until 3-5 years after release for these GPUs. The H100 is certainly still being used.
I miss when it took only 1~2 gens for used products to flood the market for cheap. When the A100 80GB came out you could get V100s for really cheap, but with the AI craze you can barely find even A100s, and those are still hella expensive :(
#19
umeng2002
Is anyone making money on AI yet? I mean other than nVidia™ and TSMC.
#20
R0H1T
Samsung's supposed to charge for parts of their AI suite after the end of next year.
#21
Dr. Dro
igormp: I miss when it took only 1~2 gens for used products to flood the market for cheap. When the A100 80GB came out you could get V100s for really cheap, but with the AI craze you can barely find even A100s, and those are still hella expensive :(
Even going as far back as GV100, it's still way out of reach for the hobbyist. I've wanted a Titan V for so long, and they've only just now hit the $400 mark out there. Here in Brazil, then, good luck...
umeng2002: Is anyone making money on AI yet? I mean other than nVidia™ and TSMC.
Oh yes. It's that big an industry.
#22
igormp
Dr. Dro: Even going as far back as GV100, it's still way out of reach for the hobbyist. I've wanted a Titan V for so long, and they've only just now hit the $400 mark out there. Here in Brazil, then, good luck...
Here at my uni we do have some Titan Vs that a prof managed to get from NVIDIA. I'm lucky to also have an A100 cluster available to use; I wonder if someday I'll be able to snatch one of those if they ever decide to replace them lol
#24
cvaldes
umeng2002: Is anyone making money on AI yet? I mean other than nVidia™ and TSMC.
Yeah, Micron is raking in piles of money. You can't run NPUs without memory. Feel free to consult their SEC filings and quarterly earnings statements.

Remember that an AI accelerator is not just one chip all by itself. There's the PCB and a bunch of other things on the board. Then the board needs to plug into something. It won't get data magically so it needs connectivity infrastructure. There are companies who make racks, power supplies, cooling systems, etc.

And data needs to be stored somewhere so all of the companies involved in solid state data storage will be making money: SK Hynix, Marvell, Samsung, etc.

Of course all of the chip design & process companies are enjoying the AI boom. Cadence, ASML, Lam Research, Applied Materials, Zeiss, KLA, etc.

It's not just Nvidia and their manufacturing partner. Naturally AMD and Intel are also making money from AI, just not as much as Nvidia.

Of course, the power company is making money too. And since the world has more than one power company, you can use the words "power companies." Plural.

Nvidia most certainly is not doing this by themselves.

Remember that AI really isn't a product. There are consumer-focused AI-powered tools, but most AI right now is being used in enterprise settings. Companies like FedEx and Walmart are using AI to remove bottlenecks in typical situations. Even in chip design, AI is being used for chip layout; what took humans a year to do manually can now be done in a month or less using AI-powered layout tools.

Lots of consumers online are fixated on consumer-facing AI solutions, but I assure you that AI chatbots are a tiny fraction of what AI is already doing. From a consumer perspective, remember that kids (teens and people in their 20s) have been using AI chatbots for a couple of years to do homework, write term papers, etc.
#25
InVasMani
Not too surprising with the AI market booming and everyone trying to grab a piece of the pie. Like it or hate it, AI has garnered a ton of attention and hasn't exactly gotten worse; rather, it has improved, making it even more desirable for people to leverage its capabilities to assist them with whatever projects they feel AI is capable of helping with. I don't see this changing much anytime soon. There is too much money at play. It's kind of like what iTunes is/was for Apple in terms of being an automatic cash cow, or like Steam for Valve. For the big players in the market there is a lot of money to be generated off the technology itself. It's like opening the floodgates to a lot of things that were previously restricted mostly to the wealthy and well educated. It's no wonder people in some cushy positions making bank on the old way of doing things aren't very keen on it. They didn't like Netflix either, because it was disruptive, especially the cable companies. They want their tightly held monopolies.