Saturday, August 3rd 2024

Design Issues May Postpone Launch of NVIDIA's Advanced Blackwell AI Chips

NVIDIA may face delays in releasing its newest artificial intelligence chips due to design issues, according to anonymous sources involved in chip and server hardware production cited by The Information. The delay could extend to three months or more, potentially affecting major customers such as Meta, Google, and Microsoft. An unnamed Microsoft employee and another source claim that NVIDIA has already informed Microsoft about delays affecting the most advanced models in the Blackwell AI chip series. As a result, significant shipments are not expected until the first quarter of 2025.

When approached for comment, an NVIDIA spokesperson did not address communications with customers regarding the delay but stated that "production is on track to ramp" later this year. The Information reports that Microsoft, Google, Amazon Web Services, and Meta declined to comment on the matter, while Taiwan Semiconductor Manufacturing Company (TSMC) did not respond to inquiries.

Update 1:
"The production issue was discovered by manufacturer TSMC, and involves the processor die that connects two Blackwell GPUs on a GB200." — via Data Center Dynamics
NVIDIA will reportedly need to redesign the chip and run a new production validation with TSMC before mass production can begin. Rumor has it the company is also considering a single-GPU version to expedite delivery. In the meantime, the delay leaves TSMC production lines temporarily idle.

Update 2:
SemiAnalysis's Dylan Patel reports in a post on X (formerly Twitter) that Blackwell supply will be considerably lower than anticipated in Q4 2024 and H1 2025. The shortage stems from TSMC's transition from CoWoS-S to CoWoS-L packaging, which NVIDIA's advanced Blackwell chips require. Currently, TSMC's AP3 packaging facility is dedicated to CoWoS-S production, while initial CoWoS-L capacity is being installed in the AP6 facility.
Additionally, NVIDIA appears to be prioritizing production of GB200 NVL72 units over NVL36 units. The GB200 NVL36 configuration features 36 GPUs in a single rack with 18 individual GB200 compute nodes. In contrast, the NVL72 design incorporates 72 GPUs, either in a single rack with 18 double GB200 compute nodes or spread across two racks, each containing 18 single nodes (the GPU arithmetic is sketched below).
Source: Bloomberg
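
To make the GPU arithmetic concrete, here is a minimal sketch of the three configurations described above. The node and GPU counts come from the report; the class structure and names are purely illustrative, and the 2-GPU-per-single-node / 4-GPU-per-double-node split is inferred from the totals.

```python
from dataclasses import dataclass

@dataclass
class RackConfig:
    name: str
    racks: int
    nodes_per_rack: int
    gpus_per_node: int  # inferred: a single GB200 node carries 2 GPUs, a double node 4

    @property
    def total_gpus(self) -> int:
        return self.racks * self.nodes_per_rack * self.gpus_per_node

configs = [
    RackConfig("GB200 NVL36 (one rack, single nodes)", racks=1, nodes_per_rack=18, gpus_per_node=2),
    RackConfig("GB200 NVL72 (one rack, double nodes)", racks=1, nodes_per_rack=18, gpus_per_node=4),
    RackConfig("GB200 NVL72 (two racks, single nodes)", racks=2, nodes_per_rack=18, gpus_per_node=2),
]

for c in configs:
    print(f"{c.name}: {c.total_gpus} GPUs")
# GB200 NVL36 (one rack, single nodes): 36 GPUs
# GB200 NVL72 (one rack, double nodes): 72 GPUs
# GB200 NVL72 (two racks, single nodes): 72 GPUs
```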

105 Comments on Design Issues May Postpone Launch of NVIDIA's Advanced Blackwell AI Chips

#76
64K
nguyenThere is nothing to give feedback on, really; just look at the RTX 4000 owners club, no one ever said anything there :roll:.

Weird to say that I'm beta testing something that works out of the box and has had no problems whatsoever.
With what's going on with Intel CPUs, AMD delays, and Nvidia delays on Blackwell, it seems prudent to wait a bit after release for anything unexpected to shake out. That's my plan anyway. The only card I ever bought at release was the GTX 970, and we saw how that turned out. The GPU was still great for the price, and the benchmarks didn't change after the news broke, but even so it would have been nice to know exactly what I was buying up front.
Posted on Reply
#77
nguyen
64KWith what's going on with Intel CPUs, AMD delays, and Nvidia delays on Blackwell, it seems prudent to wait a bit after release for anything unexpected to shake out. That's my plan anyway. The only card I ever bought at release was the GTX 970, and we saw how that turned out. The GPU was still great for the price, and the benchmarks didn't change after the news broke, but even so it would have been nice to know exactly what I was buying up front.
Well, I guess for an experienced PC builder it would be no problem. I undervolted, locked the all-core frequency, and capped the TDP on my 13700KF from the day I got it; there's no point gaining a tiny fraction of FPS at the cost of heat, power, and stability (one way to script such a cap is sketched after this post).

Hardware prices in my country never go down, so there is no point for me to wait really

I'm excited to see how 9950X3D vs Arrow Lake will turn out this October
Posted on Reply
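For anyone wanting to script a similar TDP cap rather than set it in the BIOS, here is a minimal sketch, assuming a Linux machine that exposes the intel_rapl powercap interface; the 125 W value is an illustrative choice, not the poster's actual setting.

```python
# Cap the CPU package power limit (PL1) through the Linux powercap interface.
# Assumes the package domain is intel-rapl:0 and that constraint_0 is the
# long-term limit; values are exposed in microwatts. Requires root.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")

def read_power_limit_w() -> float:
    # Sysfs reports the limit in microwatts; convert to watts.
    return int((RAPL / "constraint_0_power_limit_uw").read_text()) / 1e6

def set_power_limit_w(watts: float) -> None:
    (RAPL / "constraint_0_power_limit_uw").write_text(str(int(watts * 1e6)))

print(f"current PL1: {read_power_limit_w():.0f} W")
set_power_limit_w(125)  # illustrative cap, not a recommendation
print(f"new PL1: {read_power_limit_w():.0f} W")
```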
#78
AusWolf
64KWith what's going on with Intel CPUs, AMD delays, and Nvidia delays on Blackwell, it seems prudent to wait a bit after release for anything unexpected to shake out. That's my plan anyway. The only card I ever bought at release was the GTX 970, and we saw how that turned out. The GPU was still great for the price, and the benchmarks didn't change after the news broke, but even so it would have been nice to know exactly what I was buying up front.
That makes complete sense from the standpoint of any user who just wants things to work. However, I don't think it's fair to criticise anyone who likes beta testing and/or experimenting with cutting-edge stuff, especially coming from someone who does the same with another brand, regardless of how stable their new releases tend to be.
Posted on Reply
#79
nguyen
AusWolfThat makes complete sense from the standpoint of any user who just wants things to work. However, I don't think it's fair to criticise anyone who likes beta testing and/or experimenting with cutting-edge stuff, especially coming from someone who does the same with another brand, regardless of how stable their new releases tend to be.
There is nothing cutting edge about display drivers. Also, FYI, the shader compilation sequences in new games take a very long time; you would not want to install and reinstall beta drivers and then waste hours and hours on shader compilation.
Posted on Reply
#80
AusWolf
nguyenThere is nothing cutting edge about display drivers.
All the more reason why there's nothing wrong with beta testing as long as you know what you're signing up for.
nguyenAlso, FYI, the shader compilation sequences in new games take a very long time; you would not want to install and reinstall beta drivers and then waste hours and hours on shader compilation.
Hours? What are you talking about? The slowest game I've seen to compile shaders is Hogwarts Legacy, and it only took me a couple of minutes each time.
Posted on Reply
#81
HisDivineOrder
Nothing makes hungry people in a line hungrier than a long wait.
Posted on Reply
#82
Caring1
HisDivineOrderNothing makes hungry people in a line hungrier than a long wait.
If I'm in that line and I see and smell something better across the street, I'm going to cross it.
Posted on Reply
#83
AusWolf
HisDivineOrderNothing makes hungry people in a line hungrier than a long wait.
I wonder why people are so hungry in the first place. It's not like you can't game on current-gen graphics cards.
Posted on Reply
#84
64K
AusWolfI wonder why people are so hungry in the first place. It's not like you can't game on current-gen graphics cards.
GPUs are my weakness. I'm like a kid at Christmas time waiting for Christmas morning before the next generation drops. I get that way even if I know that I'm not going to get the new one but less so. Right now it will be months away but the last few days before reviews are the longest of all.
Posted on Reply
#85
AusWolf
64KGPUs are my weakness. I'm like a kid at Christmas time waiting for Christmas morning before the next generation drops. I get that way even if I know that I'm not going to get the new one but less so. Right now it will be months away but the last few days before reviews are the longest of all.
Good point - I like a good hardware review as well, even if I have zero intention of buying the product. :ohwell:
Posted on Reply
#86
Lycanwolfen
starfalsK, I refuse to buy this obsolete, DisplayPort 2.1-less generation... that doesn't even have enough power to max out current games under 1000 dollars. So see ya in 2025. Shame, I was ready to give them money now... 'cause playing laggy games ain't fun... but oh well. Back to Witcher 3. No more modern games for me till next year. Even the 4080 is not enough in some cases, even less if it's native without DLSS, lol. The 4080 that was supposed to be 1000 bucks is actually still 1300... and some high-end models even sell for 1600. God knows I ain't spending 1300-1600 so close to 2025 and the next-gen stuff. Yes, I'm in Europe. Tax is one thing, but these cards cost a lot more. No comment on the 4090. Even some models of the 4070 Ti Super are 1100!! The sooner this horrible generation is over, the better.
Which modern game are you wanting to play exactly?
Posted on Reply
#87
Minus Infinity
nguyenWell, I guess for an experienced PC builder it would be no problem. I undervolted, locked the all-core frequency, and capped the TDP on my 13700KF from the day I got it; there's no point gaining a tiny fraction of FPS at the cost of heat, power, and stability.

Hardware prices in my country never go down, so there is no point for me to wait really

I'm excited to see how 9950X3D vs Arrow Lake will turn out this October
Arrow Lake is coming in December, so you'll be waiting until Xmas.
Posted on Reply
#88
Qwerty101
LycanwolfenWhich modern game are you wanting to play exactly?
PyTorch, TensorFlow, 400B+ LLMs :)
Posted on Reply
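There is real arithmetic behind the joke: the weights of a 400B-parameter model alone dwarf any gaming card's VRAM. A minimal sketch, using standard bytes-per-parameter figures and an assumed 80 GB data-center accelerator:

```python
import math

# Weights-only footprint of a 400B-parameter model (ignores KV cache,
# activations, and optimizer state, which only push the numbers higher).
PARAMS = 400e9
BYTES_PER_PARAM = {"fp16/bf16": 2.0, "int8": 1.0, "int4": 0.5}
GPU_VRAM_GB = 80  # assumption: one 80 GB-class data-center accelerator

for fmt, b in BYTES_PER_PARAM.items():
    gb = PARAMS * b / 1e9
    print(f"{fmt}: {gb:,.0f} GB of weights -> at least {math.ceil(gb / GPU_VRAM_GB)} GPUs")
# fp16/bf16: 800 GB of weights -> at least 10 GPUs
# int8: 400 GB of weights -> at least 5 GPUs
# int4: 200 GB of weights -> at least 3 GPUs
```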
#89
stimpy88
Qwerty101PyTorch, TensorFlow, 400B+ LLMs :)
I agree with him; the 4080 is the absolute minimum for running any RT or UE5 game at 60 fps or higher without DLSS and frame-rate trickery on a 1440p 240 Hz or 4K 120+ Hz monitor. And for some, high-resolution, high-refresh-rate gaming is a thing. The 40x0 series being limited to DisplayPort 1.4 is pretty bad considering there are now monitors that support full-bandwidth DisplayPort 2.1 without compression; it's undeniable that nGreedia has held the monitor manufacturers back from offering faster, higher-resolution monitors. The media also raised an eyebrow when nGreedia said the 40x0 series was DisplayPort 1.4 only (the bandwidth arithmetic is sketched after this post).

Another point he makes is the cost. When you buy a $1200+ card, you expect to max out the settings in a game; you shouldn't expect it to give you 35 fps in return. The 4080 is too slow for the money, plus some people save for a long time to afford such a card and will expect it to be in their system for 3-5 years. It's an investment, and then to be told your $1200, one-year-old card is not only slow in 2025 games, but that the amazing new monitor you've been waiting for won't work with your card because it's only DP1.4...

Not everyone uses their computer the same way as you or me; you have to respect that.
Posted on Reply
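The DisplayPort complaint above is easy to check against published link rates. A minimal sketch, assuming DP 1.4 at HBR3 with 8b/10b coding and DP 2.1 at UHBR20 with 128b/132b coding, and ignoring blanking overhead (which only makes the DP 1.4 case worse):

```python
# Effective payload bandwidth of a 4-lane DisplayPort link after line coding.
DP14_HBR3 = 4 * 8.1 * (8 / 10)        # ~25.9 Gbit/s
DP21_UHBR20 = 4 * 20.0 * (128 / 132)  # ~77.6 Gbit/s

def video_gbit_s(width: int, height: int, hz: int, bits_per_pixel: int = 30) -> float:
    """Uncompressed video data rate; 30 bpp = 10-bit RGB."""
    return width * height * hz * bits_per_pixel / 1e9

need = video_gbit_s(3840, 2160, 144)
print(f"4K 144 Hz 10-bit needs ~{need:.1f} Gbit/s")          # ~35.8 Gbit/s
print(f"DP 1.4 payload ~{DP14_HBR3:.1f} Gbit/s -> DSC compression required")
print(f"DP 2.1 UHBR20 payload ~{DP21_UHBR20:.1f} Gbit/s -> fits uncompressed")
```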
#90
nguyen
stimpy88I agree with him; the 4080 is the absolute minimum for running any RT or UE5 game at 60 fps or higher without DLSS and frame-rate trickery on a 1440p 240 Hz or 4K 120+ Hz monitor. And for some, high-resolution, high-refresh-rate gaming is a thing. The 40x0 series being limited to DisplayPort 1.4 is pretty bad considering there are now monitors that support full-bandwidth DisplayPort 2.1 without compression; it's undeniable that nGreedia has held the monitor manufacturers back from offering faster, higher-resolution monitors. The media also raised an eyebrow when nGreedia said the 40x0 series was DisplayPort 1.4 only.

Another point he makes is the cost. When you buy a $1200+ card, you expect to max out the settings in a game; you shouldn't expect it to give you 35 fps in return. The 4080 is too slow for the money, plus some people save for a long time to afford such a card and will expect it to be in their system for 3-5 years. It's an investment, and then to be told your $1200, one-year-old card is not only slow in 2025 games, but that the amazing new monitor you've been waiting for won't work with your card because it's only DP1.4...

Not everyone uses their computer the same way as you or me; you have to respect that.
Every GPU is slow, including the 4090, when you have no idea about PC hardware.

Kinda like saying Usain Bolt is slow for only running 10 m/s, and that he should be running at 20 m/s, because some dude who has no idea about human limits thought so ;)
Posted on Reply
#91
stimpy88
nguyenEvery GPU is slow, including the 4090, when you have no idea about PC hardware.
That's possibly the most stupid comment I've ever heard on this forum.

We are obviously not like you, Mr Hardware expert, playing Pong on a 12" 320x180 CGA display at 10,000,000 FPS with an overclocked 4090.
Posted on Reply
#92
AusWolf
stimpy88Another point he makes is the cost. When you buy a $1200+ card, you expect to max out the settings in a game; you shouldn't expect it to give you 35 fps in return. The 4080 is too slow for the money, plus some people save for a long time to afford such a card and will expect it to be in their system for 3-5 years. It's an investment, and then to be told your $1200, one-year-old card is not only slow in 2025 games, but that the amazing new monitor you've been waiting for won't work with your card because it's only DP1.4...
It's funny that you can either spend $1200+ and still struggle to play games at maxed out settings, or spend $300 and enjoy the same games at reduced settings.
nguyenEvery GPU is slow, including the 4090, when you have no idea about PC hardware.

Kinda like saying Usain Bolt is slow for only running 10 m/s, and that he should be running at 20 m/s, because some dude who has no idea about human limits thought so ;)
No one said that the 4080 was slow. What was said was that the 4080 is too slow for the money it costs (reading comprehension?).
Posted on Reply
#93
nguyen
AusWolfIt's funny that you can either spend $1200+ and still struggle to play games at maxed out settings, or spend $300 and enjoy the same games at reduced settings.


No one said that the 4080 was slow. What was said was that the 4080 is too slow for the money it costs (reading comprehension?).
Using an arbitrary standard to judge something is just plain stupid; shouldn't you compare it to the competitors first?
stimpy88That's possibly the most stupid comment I've ever heard on this forum.

We are obviously not like you, playing Pong on a 320x180 CGA display at 10,000,000 FPS on a 4090.
Only second to your previous comment ;).
I'm playing all the latest games at 4K 144 Hz, actually; gonna need a lot more horsepower than the 4090, and the 5090 might just barely be enough.
Posted on Reply
#94
stimpy88
nguyenI'm playing all the latest games at 4K 144 Hz, actually; gonna need a lot more horsepower than the 4090, and the 5090 might just barely be enough.
BS and you know it. Or maybe you think the framerate is the same thing as the refresh rate of your monitor, Mr Hardware expert.
Posted on Reply
#95
AusWolf
nguyenUsing arbitrary metrics to judge something is just dumb; shouldn't you compare it to the competitors first?
Why would I compare it to the competitor when both of them fail to deliver what I'd expect for that kind of money?

Just to note, a GPU to me is worth 500 GBP max. If I'm paying more, I'm expecting A LOT more!
Posted on Reply
#96
nguyen
stimpy88BS and you know it.
Lol, funny, I have a 4090 and you don't, so I know more about the 4090's capabilities than you do.
Posted on Reply
#97
stimpy88
nguyenLol, funny, I have a 4090 and you don't, so I know more about the 4090's capabilities than you do.
I know you're lying. But you win the e-peen award for today. Such a big boy you are.
Posted on Reply
#98
nguyen
AusWolfWhy would I compare it to the competitor when both of them fail to deliver what I'd expect for that kind of money?

Just to note, a GPU to me is worth 500 GBP max. If I'm paying more, I'm expecting A LOT more!
Do you expect the world to give you luxury goods for free, too?
Posted on Reply
#99
stimpy88
nguyenDo you expect the world to give you luxury goods for free, too?
You're just a troll, dude.
Posted on Reply
#100
AusWolf
nguyenDo you expect the world to give you luxury goods for free, too?
Who said GPUs were luxury goods? No, what I mean is: if a GPU that costs many times more than what I consider a sensible price can't give me smooth gameplay with maxed-out graphics, then what's the point?
Posted on Reply