Sunday, March 17th 2024
NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out
Jensen Huang's opening GTC 2024 keynote is scheduled for tomorrow afternoon (13:00 Pacific Time)—many industry experts believe that the NVIDIA boss will take the stage and formally introduce his company's B100 "Blackwell" AI GPU. An enlightened few have been treated to preview (AI and HPC) units—including Dell COO Jeff Clarke—but pre-introduction leaks have been scarce. Team Green is likely enforcing strict conditions on a fortunate selection of trusted evaluators, drawn from a pool of ecosystem partners and customers.
Today, a brave soul has broken that silence—tech tipster AGF/XpeaGPU, despite fearing repercussions from the leather-jacketed one, revealed a handful of technical details a day prior to Team Green's highly anticipated unveiling: "I don't want to spoil NVIDIA B100 launch tomorrow, but this thing is a monster. 2 dies on (TSMC) CoWoS-L, 8x8-Hi HBM3E stacks for 192 GB of memory." They also crystal-balled an inevitable follow-up card: "one year later, B200 goes with 12-Hi stacks and will offer a beefy 288 GB. And the performance! It's... oh no Jensen is there... me run away!" Reuters has also joined in on the fun, with some predictions and insider information: "NVIDIA is unlikely to give specific pricing, but the B100 is likely to cost more than its predecessor, which sells for upwards of $20,000." Enterprise products are expected to arrive first—possibly later this year—followed by gaming variants, perhaps months later.
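For what it's worth, the leaked capacities add up under a plausible assumption of 24 Gb (3 GB) HBM3E dies: eight 8-Hi stacks work out to 8 × 8 × 3 GB = 192 GB, while stepping up to 12-Hi stacks gives 8 × 12 × 3 GB = 288 GB.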
Sources:
AGF Tweet, VideoCardz, Reuters, Wccftech
41 Comments on NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out
Dumb as fuck hype train. I think it has potential, but it's definitely not there yet. I think Nvidia stock is a huge huge bubble and it's going to pop within 2 years when the hype train realizes there is no money being made from it. It will take time for the pop to happen though, as humans are subject to hype train nonsense.
I’ve got this feeling of deja vu. Like we’ve been in this situation before… hmm… oh wait, it was in the two previous crypto bubbles! After the first one NV even had to explain to their shareholders why the growth just suddenly stopped. And the only reason they have come out of the second one just fine is that we’ve essentially transitioned from crypto to AI now.
To be fair, compute is compute and there is always demand for more. So not like NV will just tank. But I fully expect the current feeding frenzy to subside. Especially when, inevitably, specialized hardware will replace GPGPUs for the task, as happened with mining.
As a software engineer, I'm not convinced by AI yet - whilst many colleagues of mine want in with absolutely no idea what they need it for. The world has a number of problems to solve, so why not get AI working on them? Chances are it will come up with something, although it's likely 'it' still has much to learn.
At the same time, the article may have nothing to do with this at all.
I expect it from a considerable few in the user base, but not the staff; you guys can do better than that.
But oh well, Skynet will solve that part of the equation rather quickly :D
Is artificial general intelligence close, if you know?
Yikes, seems like a monster NVIDIA product again, though.
The money is going to be in specialties, not this general AI going on right now. But you need to learn to walk before running :)
Ayyyyyyy!
Let us not forget American history and the church of the Fonz - this is the ONLY leather-jacketed one. Amen.
I would not want to give something to anyone using ChatGPT without any knowledge of the actual code.
I would pay for an AI program that will cook with me in real time, giving advice or explaining how to cook meals I have never thought of or made before.
General models are all we have right now, but soon specialty AI will be the final stop. It's there to assist you, not replace you.
The trick right now is to tell ChatGPT in small chunks what you want to code. It does this very well. You can't say "write me a game engine", but you can ask for some logic to be written as C++ code.
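As a purely hypothetical illustration (not actual ChatGPT output), the kind of narrowly scoped request that tends to work is a single, self-contained C++ helper rather than a whole subsystem:

```cpp
// Hypothetical example of a "small chunk" request: instead of asking for a
// game engine, ask for one self-contained helper, e.g. a clamped linear
// interpolation of the sort used when blending animation or camera positions.
#include <algorithm>
#include <iostream>

// Linearly interpolate between a and b, clamping the factor t to [0, 1].
double lerp_clamped(double a, double b, double t) {
    t = std::clamp(t, 0.0, 1.0);   // requires C++17
    return a + (b - a) * t;
}

int main() {
    std::cout << lerp_clamped(0.0, 10.0, 0.25) << '\n';  // prints 2.5
    std::cout << lerp_clamped(0.0, 10.0, 1.5) << '\n';   // clamped, prints 10
}
```

A request at that granularity also leaves you with something small enough to actually review and verify yourself.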
It's just in the very early years - wait until AI can develop your next CPU/GPU without any human intervention at all. A zillion possibilities, really.