Tuesday, February 27th 2024

NVIDIA AI GPU Customers Reportedly Selling Off Excess Hardware

The NVIDIA H100 Tensor Core GPU was last year's hot item for HPC and AI industry segments—the largest purchasers were reported to have acquired up to 150,000 units each. Demand grew so much that lead times of 36 to 52 weeks became the norm for H100-based server equipment. The latest rumblings indicate that things have stabilized—so much so that some organizations are "offloading chips" as the supply crunch cools off. Apparently it is more cost-effective to rent AI processing sessions through cloud service providers (CSPs)—the big three being Amazon Web Services, Google Cloud, and Microsoft Azure.

According to a mid-February Seeking Alpha report, wait times for the NVIDIA H100 80 GB GPU model have been reduced to around three to four months. The Information believes that some companies have already reduced their order counts, while others have hardware sitting around completely unused. Maintenance complexity and costs are reportedly cited as main factors in "offloading" unneeded equipment and turning instead to renting server time from CSPs. Despite improved supply conditions, AI GPU demand is still growing, driven mainly by organizations working with large language models (LLMs). A prime example is OpenAI: as pointed out by The Information, insider murmurings have Sam Altman & Co. seeking out alternative solutions and production avenues.
Sources: The Information, Tom's Hardware

9 Comments on NVIDIA AI GPU Customers Reportedly Selling Off Excess Hardware

#1
Count von Schwalbe
T0@st: "lead times of 36 to 52 weeks"
Is that supposed to be bad? A relatively simple 1200A breaker panel takes at least that long to get, and you probably won't get a utility to get power to you in longer than that.
#2
T0@st
News Editor
Count von Schwalbe: "Is that supposed to be bad? A relatively simple 1200A breaker panel takes at least that long to get, and you probably won't get a utility to get power to you in longer than that."
It's a matter of perspective. One of the Microsoft enterprise guys indicated that he was (figuratively) tearing his hair out over AI GPU supply shortages in 2023.
#3
Count von Schwalbe
T0@st: "It's a matter of perspective. One of the Microsoft enterprise guys indicated that he was (figuratively) tearing his hair out over AI GPU supply shortages in 2023."
I work in construction - I would have expected people buying these to have them on order, or at least reserved, before building a data center, and the building itself would take much longer.
#4
Unregistered
Count von Schwalbe: "I work in construction - I would have expected people buying these to have them on order, or at least reserved, before building a data center, and the building itself would take much longer."
Welcome to the wonderful world of AI, where there are no rules and everything's done by the seat of your pants.
#5
qlum
Typical hoarding behavior: AI accelerators are becoming scarce, businesses order aggressively out of fear it will become harder to do so if they don't, and in the end they don't need as much.
#6
Eternit
So some NVIDIA customers have already saturated their needs, yet they will keep receiving backlogged orders for a few months and selling them off. This will quickly saturate other customers' needs, and there will be a growing pile of unneeded GPUs. Then a few new TSMC and Intel foundries will come online. Two years from now, GPUs will have normal prices.
#7
ThrashZone
Hi,
AI told them to hehe
Who is this dude trying to fool
He looked like Ballmer 5 years before this :laugh:
#8
kondamin
More of this please, it would be very nice if that bubble popped, and the cloud bubble while we are at it.
#9
piloponth
Classic hype curve. It looks like we are past the peak and a big fall is upon us. That is great; the inflated prices of graphics cards are coming to an end.