Monday, September 18th 2023

NVIDIA Blackwell GB100 Die Could Use MCM Packaging

NVIDIA's upcoming Blackwell GPU architecture, expected to succeed the current Ada Lovelace (consumer) and Hopper (data-center) architectures, is gearing up for some significant changes. While we don't have any microarchitectural leaks, rumors are circulating that Blackwell will use different packaging and die structures. One of the most intriguing aspects is the mention of a Multi-Chip Module (MCM) design for the GB100 data-center GPU. This advanced packaging approach places different GPU components on separate dies, giving NVIDIA more flexibility in chip configuration. It could allow NVIDIA to tailor its chips more easily to the specific needs of various consumer and enterprise applications, potentially gaining a competitive edge over rivals like AMD.

While Blackwell's release is still a few years away, these early tidbits paint a picture of an architecture that isn't just an incremental improvement but could represent a more significant shift in how NVIDIA designs its GPUs. NVIDIA's most direct competitor here is AMD's upcoming MI300 accelerator, which uses chiplets in its design. Chiplets also ease manufacturing: smaller dies yield better on a wafer, so it makes economic sense to move away from giant monolithic dies and toward chiplet-based packages.
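The yield argument above can be illustrated with a simple Poisson defect model, a common first-order approximation for wafer yield. The defect density and die areas below are purely illustrative assumptions, not NVIDIA's or TSMC's actual numbers:

```python
import math

def die_yield(die_area_mm2, defect_density_per_mm2=0.001):
    """Poisson yield model: probability a die has zero random defects."""
    return math.exp(-defect_density_per_mm2 * die_area_mm2)

# Assumed defect density: 0.1 defects/cm^2 (0.001 per mm^2), illustrative only.
big = die_yield(800)    # near-reticle-limit monolithic die
small = die_yield(200)  # one chiplet at a quarter of the area

print(f"800 mm^2 monolithic die: {big:.1%} of dies are good")   # 44.9%
print(f"200 mm^2 chiplet:        {small:.1%} of dies are good")  # 81.9%
```

Because chiplets are tested individually and only known-good dies are packaged together, a random defect scraps 200 mm² of silicon instead of 800 mm², so far more of each wafer ends up in sellable products.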
Source: kopite7kimi

11 Comments on NVIDIA Blackwell GB100 Die Could Use MCM Packaging

#1
bonehead123
AleksandarK: "isn't just an incremental improvement"
Yea, but you can say that about any change in performance that exceeds the paltry, minuscule 3-10% that they have been eking out since way back :D
Posted on Reply
#2
Legacy-ZA
I used to get excited by GPU news, but then I took their ridiculous price tags to the eyes.
Posted on Reply
#3
Lycanwolfen
Yep, here it comes: SLI on a single card. Hmm, who else did that 23 years ago? 3dfx.
Posted on Reply
#4
HisDivineOrder
Who cares about cards that will cost $1999 if they even come to the consumer space at all? This generation Nvidia used something that should have just made gaming better in select cases (DLSS3 with frame generation) to give people less raster improvement per market tier at higher cost per tier.

I'm less interested in Nvidia than AMD's improvements for next gen handhelds because I know Nvidia won't stop until I'm priced out of desktop gaming and AMD isn't interested in competing at all. I guess there's Intel, but so far they don't seem interested in competing, either. Let Battlemage prove me wrong.
Posted on Reply
#5
R-T-B
Lycanwolfen: "Yep here it comes SLI single card. Hmm who else did that 23 years ago. 3dfx."
Not what this is conceptually at all.
Posted on Reply
#6
Minus Infinity
Hardly news at all. The maximum die size (reticle limit) with current ASML tools is ~800 mm², and next-gen high-NA EUV will actually shrink it. NVIDIA has no choice but to go MCM, as it can't keep making giant monolithic dies for much longer.
Posted on Reply
#7
N/A
HisDivineOrder: "Who cares about cards that will cost $1999 if they even come to the consumer space at all?"
This is the Hopper H100 successor, and with 24 ROPs it's hardly any better than a 4060.
The consumer GPUs are GB202, 203, 205, and such. And it will be worth it: worst case, double the performance of a 4070 at double the price.
Posted on Reply
#8
LabRat 891
Correct me if I'm wrong, but
didn't NVLink kinda set a precedent for this?
IIRC, it's conceptually similar to HyperTransport/Infinity Fabric.
Posted on Reply
#9
Dirt Chip
HisDivineOrder: "Who cares about cards that will cost $1999 if they even come to the consumer space at all? This generation Nvidia used something that should have just made gaming better in select cases (DLSS3 with frame generation) to give people less raster improvement per market tier at higher cost per tier."
We are getting 'trained' to pay for new features (RT, frame generation, DLSS/FSR improvements) and not for performance improvement. This gen's GPUs are mostly side-grades (and some downgrades) if you don't count the new features. Gone are the days of better gen-over-gen value; only within-gen value remains.

I guess we will see more of this stuff and less raw perf improvement with next-gen GPUs.
We might even see a monthly subscription fee for 'premium features' that will upscale your fps using AI software and such. The more you (monthly) pay, the higher your fps will be.

Unless you enjoy this rat race, just stick to your current build (mainly your GPU) until it breaks, and compromise on settings as much as you need to get there.
Posted on Reply
#10
Vayra86
Dirt Chip: "We might even see a subscription monthly fee for 'premium features' that will upscale your fps using AI software and such. The more you (monthly) pay, the higher your fps will be."
Might get?
It's already here.

Posted on Reply
#11
Dirt Chip
Vayra86: "Might get? It's already here."

Yes, this is probably the foundation. It applies to gaming-on-demand and renting GPU power online, but that's still not what I'm talking about.
We might see a similar thing, but the GPU (say, a 4080) will be in your system after you pay whatever it costs. Then, if you have a recent enough GPU, you will be able to pay extra on a monthly basis to 'unleash its true potential' via whatever AI magic they invent. An extra payment to have DLSS4 path tracing on, or whatever.
We will get the first 12 months for free, so no need to worry.
Posted on Reply