Sunday, March 17th 2024

NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out

Jensen Huang's opening GTC 2024 keynote is scheduled for tomorrow afternoon (13:00 Pacific Time). Many industry experts believe that the NVIDIA boss will take the stage and formally introduce his company's B100 GPU, built on the "Blackwell" architecture. An enlightened few, including Dell COO Jeff Clarke, have been treated to preview (AI and HPC) units, but pre-introduction leaks have been scarce. Team Green is likely enforcing strict confidentiality conditions on a fortunate selection of trusted evaluators, drawn from a pool of ecosystem partners and customers.

Today, a brave soul has broken that silence: tech tipster AGF/XpeaGPU, who evidently fears repercussions from the leather-jacketed one, revealed a handful of technical details a day prior to Team Green's highly anticipated unveiling: "I don't want to spoil NVIDIA B100 launch tomorrow, but this thing is a monster. 2 dies on (TSMC) CoWoS-L, 8x8-Hi HBM3E stacks for 192 GB of memory." They also crystal-balled an inevitable follow-up card: "one year later, B200 goes with 12-Hi stacks and will offer a beefy 288 GB. And the performance! It's... oh no Jensen is there... me run away!" Reuters has also joined in on the fun, with some predictions and insider information: "NVIDIA is unlikely to give specific pricing, but the B100 is likely to cost more than its predecessor, which sells for upwards of $20,000." Enterprise products are expected to arrive first, possibly later this year, followed by gaming variants some months later.
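The leaked capacities line up with common HBM3E stack configurations. As a quick back-of-the-envelope check, assuming 24 Gb (3 GB) per die layer (an assumption on our part, not something the leak states), the numbers work out exactly:

```python
# Sanity check of the leaked memory figures, assuming 3 GB (24 Gb)
# per HBM3E die layer -- an assumption, not stated in the leak.
GB_PER_DIE = 3   # assumed HBM3E per-layer density in GB
STACKS = 8       # "8x 8-Hi HBM3E stacks" per the leak

b100_gb = STACKS * 8 * GB_PER_DIE    # 8-Hi stacks
b200_gb = STACKS * 12 * GB_PER_DIE   # rumored 12-Hi stacks

print(b100_gb)  # 192, matching the leaked B100 capacity
print(b200_gb)  # 288, matching the rumored B200 capacity
```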
Sources: AGF Tweet, VideoCardz, Reuters, Wccftech

41 Comments on NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out

#26
FoulOnWhite
Because of the cost though, these are only an option for the mega corps or mega rich.
#27
john_
Antique4106: I wonder what AMD's response to this will be... Well, time to get ready for whatever the MI350X brings to the table, because this is definitely going to require something new for AMD to compete with it. The current MI300X is a bit better than the H200, if I remember correctly.
Considering they are already working with chiplets, they can do 1.5 times or even twice the MI300, plus 1.5 times or twice the HBM at, probably 1000-1200W and call it a day.
#28
Jism
FoulOnWhite: Because of the cost though, these are only an option for the mega corps or mega rich.
You do know that tech in today's Formula 1, for example, will sooner or later reach consumer cars, right?

Takes a couple of years.
#29
wolf
Better Than Native
ZoneDymo: You need to work on reading comprehension, it seems; it's a joke upon the joke in the last line of the Twitter post... come on, man.
The only time it is written here on TechPowerUp, either in the news article itself or in the images in this article, is by the staff member. I don't follow links off this site to Twitter, sorry not sorry. It may have proved wiser to include a screenshot of the post that said it, if the context was to be apparent. There's nothing wrong with my reading comprehension.
#30
ZoneDymo
wolf: The only time it is written here on TechPowerUp, either in the news article itself or in the images in this article, is by the staff member. I don't follow links off this site to Twitter, sorry not sorry. It may have proved wiser to include a screenshot of the post that said it, if the context was to be apparent. There's nothing wrong with my reading comprehension.
What... the image of the post is right there under the article... with an added visual joke from the poster, to boot.
#31
N/A
Why so scared of Jensen and running away D.Labelle style, who knows. This leak is clearly intentional and controlled, and not so detailed at all. Multi-chip and HBM3E are well known by now, and the predecessor H200 had 141 GB already. So what's new? Just pumping it up a day before for more impact.
#32
wolf
Better Than Native
ZoneDymo: What... the image of the post is right there under the article... with an added visual joke from the poster, to boot.
And where in that picture does it say "leather-jacketed one"?
#33
ZoneDymo
wolf: And where in that picture does it say "leather-jacketed one"?
No, that is the joke upon the joke from the Twitter post. Idk what is going on here; are you offended that the common joke associating Jensen with his eternally unchanging wardrobe of leather jackets exists?
#34
wolf
Better Than Native
ZoneDymo: No, that is the joke upon the joke from the Twitter post. Idk what is going on here; are you offended that the common joke associating Jensen with his eternally unchanging wardrobe of leather jackets exists?
wolf: For a news post from a staff member, I find this content in the article itself to be in very poor taste.

I expect it from a considerable few in the user base, but not the staff, you guys can do better than that.
The context helps, and could/should have been included if it's a reference to a quote from the post that was not shown.
#35
ZoneDymo
wolf: The context helps, and could/should have been included if it's a reference to a quote from the post that was not shown.
I mean, idk how much more reference you need: TPU reported news partly based on what the Twitter post stated, added a hyperlink to the post in the article, AND added a screenshot of the Twitter post underneath...

Apart from that, you did not answer the question: are you offended by the association joke about Jensen and his leather jackets, based on the fact that that is the only thing he ever wears (willingly)?
And is that the catalyst for the negative criticism of the reporting in this article?
#36
wolf
Better Than Native
ZoneDymo: I mean
I've already made my thoughts on the whole situation perfectly clear in my previous replies; this is going nowhere. Please stop trying to argue your point.
#37
FoulOnWhite
Maybe Jensen is a wannabe biker, or it's his midlife Crysis jacket.
#38
Antique4106
john_: Considering they are already working with chiplets, they can do 1.5 times or even twice the MI300, plus 1.5 times or twice the HBM at, probably 1000-1200W and call it a day.
Yes... funny how efficient it can be to use technology designed to be future-proof and scalable, instead of just pouring countless billions into evolving a monolithic architecture, isn't it?
(This is a jab at Nvidia, not the user I'm replying to)
#39
Unregistered
the54thvoid: I, for one, am very reassured that the leather-jacketed one works for Nvidia. It makes them cooler.



Ayyyyyyy!

Let us not forget American history and the church of the Fonz - this is the ONLY leather-jacketed one. Amen.
Technically, we have 4x Confirmed Jackets: Brando, James Dean, The Fonz and The Terminator (Cameron got us joint custody).
The UK however, has Rob Halford, and being a Metal God nets him at least 2-3x Jackets.

As for Jensen...no, he does not get to be in the All-Time Cool Guy Leather Jacket Club.

#40
AusWolf
Solid State Soul (SSS): If I sip a drink every time I read the word AI, I would die by the end of the day
Totally. All this massive push Nvidia is giving this hype train makes me think it's not going as fast as they'd want.

By the way, is Blackwell not their new mainstream architecture? I'm confused now.
#41
SoppingClam
I wonder how many of the latest-gen consoles it would take to match the performance of this GPU?

In pretty much every game I can consistently get over 120 fps. Also, I don't get screen tearing or any artifacts at 999+ fps in anything, including games like Rocket League at 4K, with my RTX 4090. But when unlocked it uses 99% GPU and about 450 W.

So, cap it down to 240 fps; at 120 fps it uses 80 watts.

That's the benefit of a 4090 over a 4080: while it has a higher TDP and performance potential, it can also use a lot less power than other GPUs if you cap it.

Either way, it seems too soon to release a 5090 while prices are still an extra zero too high. There are no new games coming out that require a 4090 killer, consoles need a massive refresh (or all new games will become more cartoony so consoles can have the 4K 100+ fps experience), and there is no new GTA or Cyberpunk coming out this year.

However, while it has been a few years since the NVIDIA Marbles demonstration was shown, if it is released to test and I don't get 120+ fps at 4K with my 4090 @ 2910-3000 MHz (typical core clock), then I'd see the point.