Friday, February 23rd 2024

NVIDIA Expects Upcoming Blackwell GPU Generation to be Capacity-Constrained

NVIDIA is anticipating supply issues for its upcoming Blackwell GPUs, which are expected to significantly improve artificial intelligence compute performance. "We expect our next-generation products to be supply constrained as demand far exceeds supply," said Colette Kress, NVIDIA's chief financial officer, during a recent earnings call. This prediction of scarcity comes just days after an analyst noted much shorter lead times for NVIDIA's current flagship Hopper-based H100 GPUs tailored to AI and high-performance computing. The eagerly anticipated Blackwell architecture and B100 GPUs built on it promise major leaps in capability—likely spurring NVIDIA's existing customers to place pre-orders already. With skyrocketing demand in the red-hot AI compute market, NVIDIA appears poised to capitalize on the insatiable appetite for ever-greater processing power.

However, the scarcity of NVIDIA's products may present an excellent opportunity for significant rivals like AMD and Intel. If either company can offer a product that beats NVIDIA's current H100 and pair it with a suitable software stack, customers may be willing to jump to their offerings rather than wait out months-long lead times. Intel is preparing the next-generation Gaudi 3 and working on the Falcon Shores accelerator for AI and HPC. AMD is shipping its Instinct MI300 accelerator, a highly competitive product, while already working on the MI400 generation. It remains to be seen whether AI companies will begin adopting non-NVIDIA hardware or remain loyal customers and accept the longer lead times of the new Blackwell generation. Capacity constraints should, however, only be a problem at launch, with availability improving from quarter to quarter. As TSMC expands CoWoS packaging capacity and 3 nm production, NVIDIA's allocation of 3 nm wafers will likely grow over time as the company shifts its priority from H100 to B100.
Sources: Q4 Earnings Call Transcript, via Tom's Hardware

64 Comments on NVIDIA Expects Upcoming Blackwell GPU Generation to be Capacity-Constrained

#1
Space Lynx
Astronaut
Just raise launch prices for the first round of sales until supply catches back up. Meh, I'm in no rush; got plenty of backlog. But fear not, lads, the ultima rig will be mine someday :)
#2
punani
Just give me my 400€ 4070 SUPER and I'll gladly wait for whatever comes next.
#3
kondamin
They should have the money now to support a couple of lines at different fabs.
Allowing competition to chip away at their market dominance with other architectures is going to hurt more than less-constrained markets and higher capex would.
punani: Just give me my 400€ 4070 SUPER and I'll gladly wait for whatever comes next.
4070 Ti SUPER for €400
#4
Lionheart
Jensen - "What narrative are we using to justify our BS prices for newer GPU's?"

Nvidia Sales goon - "Supply constraint?"

Jensen - "Perfect!"
#5
hat
Enthusiast
It's a bit like insanity if you ask me. Why are we pushing so hard for smaller and smaller manufacturing processes with crap yields that come from a single place? This current path only results in expensive, hot-running, loud, watt-guzzling equipment that's too scarce to satisfy demand (allegedly). Maybe if we gave the process engineers a break we'd all be better off for it.
#6
katzi
Lionheart: Jensen - "What narrative are we using to justify our BS prices for newer GPUs?"

Nvidia Sales goon - "Supply constraint?"

Jensen - "Perfect!"
^Literally this^
#7
londiste
hat: It's a bit like insanity if you ask me. Why are we pushing so hard for smaller and smaller manufacturing processes with crap yields that come from a single place? This current path only results in expensive, hot-running, loud, watt-guzzling equipment that's too scarce to satisfy demand (allegedly). Maybe if we gave the process engineers a break we'd all be better off for it.
What is the alternative? Perhaps less expensive but slower, even hotter-running and more watt-guzzling?
#8
ilyon
londiste: What is the alternative? Perhaps less expensive but slower, even hotter-running and more watt-guzzling?
And without any trace of RTX™ and DLSS™, who cares?
#9
usiname
hat: It's a bit like insanity if you ask me. Why are we pushing so hard for smaller and smaller manufacturing processes with crap yields that come from a single place? This current path only results in expensive, hot-running, loud, watt-guzzling equipment that's too scarce to satisfy demand (allegedly). Maybe if we gave the process engineers a break we'd all be better off for it.
If you use older manufacturing processes, the new GPUs will be even slower and hotter. Also, slower new gens with minimal improvement over the previous gen will result in poor sales and less money for the leather jacket, so he needs to make faster GPUs. Add the terrible performance of the new games, plus the BS about how if you don't play with RTX and path tracing you aren't really playing, and you get the current situation: the leather jacket is desperately trying to make every game run at 20 FPS on a 4090 at 4K and make the older GPUs look slower, and he won't stop. The situation will be the same with the 5090 as it is now with the 4090, and the bottom tiers (60, 70 and 80) will be even worse than now, so everything points at only one thing: money and more money.

The stupidest thing is that new games at low settings are still heavier than Watch Dogs 1 from 2014 at ultra, and look much worse. I am fine with Watch Dogs 1 ultra graphics, but in 4-5 years I will need at least a 4070 just to play at the same level of graphics I was used to in 2014 with an HD 7870. Progress, just in the wrong direction.
#10
3valatzy
This is because it will be built on the 3 nm process.

Meanwhile, AMD's new RX 8700 XT (the top RDNA 4 part) will stay on the older 4 nm process.
The prediction is double the raster performance and triple the ray-tracing performance.

The RTX 5090 will be 60% faster than the RTX 4090, launching between Q4 this year and Q2 next year, depending on how badly AMD fails.
#11
iameatingjam
The goal was for my 4090 to last as long as my 1070 - 7 years. We'll see if that's possible. But I certainly won't be buying anything Blackwell.
#12
Quitessa
I could easily see nVidia starting off at super-exorbitant prices, like $10k+ for a 90-class card, just to see if people will buy it, then slowly dropping prices from there until sales start to meet supply, and keeping to that price.
#13
wolf
Performance Enthusiast
Lionheart: Jensen - "What narrative are we using to justify our BS prices for newer GPUs?"

Nvidia Sales goon - "Supply constraint?"

Jensen - "Perfect!"
Lisa Su - "perfect, well just copy that pricing model too"
#14
Lionheart
wolf: Lisa Su - "Perfect, we'll just copy that pricing model too"
Lol, she wishes she could do that, but AMD lacks the mindshare/feature set with their Radeon GPUs, so they gotta stay 10-15% lower on price to stay competitive/noticeable.
#15
Bwaze
2020, RTX 3080 - $700
2022, RTX 4080 - $1200
2024, RTX 5080 - $2040 <- solution
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10022
2032, RTX 9080 - $17038
2034, RTX 1080 - $28965
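
For the curious: that progression is just the 3080-to-4080 jump (1200/700, roughly 1.7x) compounded every two years. A minimal Python sketch, assuming that ~1.7x per-generation ratio holds, reproduces the list above to within a dollar of rounding:

    # Extrapolate x080-class MSRPs at ~1.7x per generation
    # (the 1200/700 ratio of the 3080 -> 4080 jump).
    price = 700.0  # RTX 3080 launch MSRP, 2020
    for year in range(2022, 2036, 2):
        price *= 1200 / 700 if year == 2022 else 1.7
        print(f"{year}: ~${price:,.0f}")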
#16
The Shield
However, the scarcity of NVIDIA's products may present an excellent opportunity for significant rivals like AMD and Intel.
No, it doesn't. They will simply align with Nvidia's out-of-scale prices, selling very, very low quantities and keeping people willing to buy an Nvidia card, because of the identical price.
#17
Onasi
Of course it will be supply constrained; we are in the middle of another bubble. But even if it wasn't and we weren't, I don't think anyone would expect prices to go down significantly on consumer hardware. NV is in a position where they can, pretty much, do whatever they want pricing-wise.
#18
Dimitriman
Lionheart: Jensen - "What narrative are we using to justify our BS prices for newer GPUs?"

Nvidia Sales goon - "Supply constraint?"

Jensen - "Perfect!"
Yes, the eternal Supply Constraint.

Jensen, we gamers are also suffering from a supply constraint: the supply of money in our pockets. So we will surely balance things out...
#19
iameatingjam
londiste: What is the alternative? Perhaps less expensive but slower, even hotter-running and more watt-guzzling?
I think he's saying: better designs, new creative solutions, that kind of thing. It's gonna come down to that eventually anyway; can't keep shrinking transistors forever.
#20
Denver
Bwaze: 2020, RTX 3080 - $700
2022, RTX 4080 - $1200
2024, RTX 5080 - $2040 <- solution
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10022
2032, RTX 9080 - $17038
2034, RTX 1080 - $28965
The prices seem realistic. But if they launch something for the gaming line (this year), it will likely be the RTX 5090, priced at $3000 and available in very limited supply.
#21
Carillon
The title mentions GPUs, but the article is all about AI accelerators. Video cards are a thing of the past; we should move on and play x86 handhelds connected to a monitor with keyboard and mouse.
#22
Vayra86
3valatzy: This is because it will be built on the 3 nm process.

Meanwhile, AMD's new RX 8700 XT (the top RDNA 4 part) will stay on the older 4 nm process.
The prediction is double the raster performance and triple the ray-tracing performance.

The RTX 5090 will be 60% faster than the RTX 4090, launching between Q4 this year and Q2 next year, depending on how badly AMD fails.
Please let me vape some out of that crystal ball you've got there. I love me a dose of wishful thinking.
#23
londiste
iameatingjam: I think he's saying: better designs, new creative solutions, that kind of thing. It's gonna come down to that eventually anyway; can't keep shrinking transistors forever.
You mean things like DLSS? :D
There is a whole bunch of smart people working on all that, regardless of how well the transistors shrink.
#24
Bwaze
Carillon: The title mentions GPUs, but the article is all about AI accelerators. Video cards are a thing of the past; we should move on and play x86 handhelds connected to a monitor with keyboard and mouse.
Nvidia does that every time something comes up as even a potential market. Remember Automotive, which never materialized? And during the crypto madness, Nvidia talked about a strong data and server sector, because they knew it's not really cool to brag about crypto mining.

Gaming, as large as it is, can't do explosive growth, or any quick growth. And they know it will be there whether they market it or not, as long as they provide a reasonable product. But when they have better things to sell, things with higher margins, they know they can put gaming on the sidetracks for years.
#25
iameatingjam
londiste: You mean things like DLSS? :D
There is a whole bunch of smart people working on all that, regardless of how well the transistors shrink.
That's not really what I meant; I was thinking more of efficient designs on existing nodes. But I suppose software solutions like DLSS fit into that.