Tuesday, June 11th 2024

Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

Possible specifications of NVIDIA's various GeForce "Blackwell" gaming GPUs were leaked to the web by Kopite7kimi, a reliable source for NVIDIA leaks. These are the specs of the maxed-out silicon; NVIDIA will carve out several GeForce RTX 50-series SKUs from these chips, which could end up with lower shader counts than those shown here. We've known from older reports that there will be five chips in all: the GB202 being the largest, followed by the GB203, GB205, GB206, and GB207. There is a notable absence of a successor to the AD104, GA104, and TU104, because NVIDIA is taking a slightly different approach to the performance segment this generation.

The GB202 is the halo-segment chip that will drive the possible RTX 5090 (the RTX 4090 successor). This chip is endowed with 192 streaming multiprocessors (SM), or 96 texture processing clusters (TPCs). These 96 TPCs are spread across 12 graphics processing clusters (GPCs), each holding 8 of them. Assuming that "Blackwell" retains the 256 CUDA cores per TPC that the past several generations of NVIDIA gaming GPUs have had, we end up with a total CUDA core count of 24,576. Another interesting aspect of this mega-chip is memory. The GPU implements next-generation GDDR7 memory, and uses a mammoth 512-bit memory bus. Assuming the 28 Gbps memory speed rumored for NVIDIA's "Blackwell" generation, this chip has 1,792 GB/s of memory bandwidth on tap!
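The arithmetic above can be sanity-checked in a few lines. This is a quick sketch, not anything official: the 256-cores-per-TPC figure (i.e. 2 SM per TPC at 128 CUDA cores per SM) and the 28 Gbps data rate are the assumptions stated in the article, not confirmed specs.

```python
# GB202 shader-count and bandwidth arithmetic as described above.
GPCS = 12             # graphics processing clusters
TPCS_PER_GPC = 8      # texture processing clusters per GPC
CORES_PER_TPC = 256   # assumed: 2 SMs per TPC x 128 CUDA cores per SM

tpcs = GPCS * TPCS_PER_GPC           # 96 TPCs
sms = tpcs * 2                       # 192 streaming multiprocessors
cuda_cores = tpcs * CORES_PER_TPC    # 24,576 CUDA cores

# Bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
bus_width_bits = 512
data_rate_gbps = 28                  # rumored GDDR7 speed
bandwidth_gbs = bus_width_bits * data_rate_gbps // 8   # 1,792 GB/s

print(f"{cuda_cores:,} CUDA cores, {bandwidth_gbs:,} GB/s")  # 24,576 CUDA cores, 1,792 GB/s
```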

The GB203 is the next chip in the series, poised to be the successor in name to the current AD103. It barely increases shader counts generationally, counting on the architecture and clock speeds to deliver the performance uplift, while retaining the AD103's 256-bit bus width. The net result could be a significantly smaller GPU than the AD103, with better performance. The GB203 is endowed with 10,752 CUDA cores, spread across 84 SM (42 TPCs). The chip has 7 GPCs, each with 6 TPCs. The memory bus, as mentioned, is 256-bit, and at a memory speed of 28 Gbps would yield 896 GB/s of bandwidth.

The GB205 will power the lower half of the performance segment in the GeForce "Blackwell" generation. This chip has a rather surprising CUDA core count of just 6,400, spread across 50 SM (25 TPCs), which are arranged in 5 GPCs of 5 TPCs each. The memory bus width is 192-bit. At 28 Gbps, this works out to 672 GB/s of memory bandwidth.

The GB206 drives the mid-range of the series. This chip has 4,608 CUDA cores, spread across 36 SM (18 TPCs). The 18 TPCs span 3 GPCs of 6 TPCs each. The key differentiator between the GB205 and GB206 is memory bus width, which is narrowed to 128-bit for the GB206. With the same 28 Gbps memory speed used here, the chip would end up with 448 GB/s of memory bandwidth.

At the entry level there is the GB207, a significantly smaller chip with just 2,560 CUDA cores across 20 SM (10 TPCs), spanning two GPCs of 5 TPCs each. The memory bus width is unchanged at 128-bit, but the memory type is the older-generation GDDR6. Assuming NVIDIA uses 18 Gbps memory speeds, it ends up with 288 GB/s on tap.
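The same bus-width-times-speed arithmetic applies down the whole stack. Here is a quick sketch that reproduces the lineup's figures; the SM counts and bus widths come from the leak, while the 128 CUDA cores per SM and the 28 Gbps (GDDR7) / 18 Gbps (GDDR6) memory speeds are the assumptions carried through the article:

```python
# Leaked SM counts and bus widths per chip, plus assumed memory speeds.
chips = {
    # name: (SMs, bus width in bits, memory speed in Gbps)
    "GB202": (192, 512, 28),  # GDDR7
    "GB203": (84,  256, 28),
    "GB205": (50,  192, 28),
    "GB206": (36,  128, 28),
    "GB207": (20,  128, 18),  # GDDR6
}

for name, (sms, bus_bits, gbps) in chips.items():
    cores = sms * 128                   # 128 CUDA cores per SM (assumed)
    bandwidth = bus_bits * gbps // 8    # bits/s per pin -> GB/s total
    print(f"{name}: {cores:>6,} cores, {bus_bits}-bit @ {gbps} Gbps -> {bandwidth} GB/s")
```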

NVIDIA is expected to double down on large on-die caches across all of its chips, to cushion the memory sub-systems. We expect several other innovations in the areas of ray tracing performance, AI acceleration, and other features exclusive to the architecture. The company is expected to debut the series sometime in Q4 2024.
Source: kopite7kimi (Twitter)

141 Comments on Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

#1
Assimilator
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
#2
Gigaherz
Yo, where's my 384-bit, still barely affordable pro GPU?
#3
Tomgang
It's all rumors, and we don't know which of these GPUs will power each model. We also don't know specs like clock speeds or CUDA core counts, how many RT cores, and so on.

But unless the RTX 5080 uses a cut-down version of the GB202, it doesn't seem like the RTX 5080 will be that beastly. That is, of course, assuming it will be made with the GB203 GPU.

With a cut-down GB202 GPU, however, I see a potentially beastly GPU that can or will beat the RTX 4090. With the GB203 I am not so sure. That will depend on IPC, clock speed, and memory clock as well.

We can only guess. The RTX 5090 looks to be a beast, but if other rumors are true, the RTX 5090 can end up more expensive than the RTX 4090 as well.

Well, no matter what, I don't expect to upgrade to Blackwell. It's going to be a hell of an expensive upgrade for me. Not just the GPU, since I'd need at least an RTX 5080 for any meaningful upgrade over the RTX 4090, but also a new CPU, because my 5950X will definitely bottleneck a high-end Blackwell GPU. So besides a new GPU, I would need a new CPU, motherboard, CPU cooler, and memory. So no, I think I shall just keep my current setup with the 5950X and RTX 4090 and be happy. Besides, this combo is not so bad yet either.
#4
Ravenmaster
Where's the 384-bit model with 24GB GDDR7 though? Seems like a big gap between the top model and the next one down
#5
Assimilator
TomgangIt's all rumors, and we don't know which of these GPUs will power each model. We also don't know specs like clock speeds or CUDA core counts, how many RT cores, and so on.

But unless the RTX 5080 uses a cut-down version of the GB202, it doesn't seem like the RTX 5080 will be that beastly. That is, of course, assuming it will be made with the GB203 GPU.
It doesn't have to be beastly, it just has to be faster than the previous generation's x080, because NVIDIA is competing with itself.
TomgangWith a cut-down GB202 GPU, however, I see a potentially beastly GPU that can or will beat the RTX 4090. With the GB203 I am not so sure. That will depend on IPC, clock speed, and memory clock as well.

We can only guess. The RTX 5090 looks to be a beast, but if other rumors are true, the RTX 5090 can end up more expensive than the RTX 4090 as well.
It'll absolutely be more expensive than the 4090; a 512-bit bus and 32+ GB of the newest memory type is not going to be cheap in any way, shape, or form.
RavenmasterWhere's the 384-bit model with 24GB GDDR7 though? Seems like a big gap between the top model and the next one down
The gap is intentionally large so that NVIDIA can charge an equally large price premium.
#6
64K
Still, the 5080 screams disappointment, and the Nvidia stack will probably once again be overpriced, just like Ada.
#7
Dristun
They're really making the 80 look even worse compared to the 90 than with Ada, huh. Two thousand buckarinoos at least for the 5090, I gather?
#8
Carillon
RavenmasterWhere's the 384-bit model with 24GB GDDR7 though? Seems like a big gap between the top model and the next one down
It will probably come as a cut-down of the big die.
#9
Tomgang
AssimilatorIt doesn't have to be beastly, it just has to be faster than the previous generation's x080, because NVIDIA is competing with itself.


It'll absolutely be more expensive than the 4090 - a 512-bit bus and 32+ GB of the newest memory type is not going to be cheap in any way shape or form.


The gap is intentionally large so that NVIDIA can charge an equally large price premium.
Sure, technically it just has to be faster. But if it's only barely faster than the RTX 4080 Super, fewer people will be upgrading. Especially if the price is bad as well. I can see it for me with Ngreedia.

I'm afraid the 5090 will indeed be more expensive. 2,000 USD or thereabouts is my guess.
DristunThey're really making the 80 look even worse compared to the 90 than with Ada, huh. Two thousand buckarinoos at least for the 5090, I gather?
Yeah, I agree, that is around where I would expect the price to be. Especially if AMD's next gen is no match for Blackwell.
#10
dgianstefani
TPU Proofreader
5080 Ti/5090 here I come.

3080 Ti has been great, but it's time for an upgrade.

Hoping for around 40% better performance than Ada; more is great, of course.

3080 Ti to 5090/5080 Ti ideally around 2x faster.

Since I framecap to 237 FPS, faster/more efficient cards also means a lower total wattage which is always nice, unless new games/software push the GPU that much harder, which I doubt. Ada was a significant efficiency leap and very tempting, but I don't upgrade every gen of GPU.
#11
LazyGamer
My cat says:
"The meow you buy, the meow you save."
#12
JWNoctis
Is this finally going to be something that can drive CP2077 at native 4K with all the bells and whistles and DLAA, at 100+ FPS? Strange there wasn't as much of a "But can it run CP2077" snowclone. :p
#13
Vayra86
AssimilatorI really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Yeah! More stagnation! Let's vote for stagnation!

I don't know what you're looking at, but I'm seeing a slight uptick per tier, with everything increased except the bus width. GDDR7 makes up for part of the deficit, though, so bandwidth won't be relatively worse than Ada; that's good. But capacity: still 12 GB in the midrange and 8 GB at the bottom end? You're saying this is a good thing now? Ada is already bandwidth-constrained at the lower tiers. Nvidia is trying real hard to keep those tiers to what, 1080p gaming?

To each their own, but I think in 2025 people would like to move on from 1080p. The 8 GB tier is by then bottom-line useless and relies mostly on cache; the 12 GB tier can't ever become a real performance-tier midrange for long. It's worse than the position Ada's 12 GB cards are in today in terms of longevity. Sure, they'll be fine today and on release. But they'll be useless by or around 2026, much like the current crop of Ada 12 GB cards.

As for AMD's inability to compete... RT fools & money were parted here. AMD keeps pace just fine in actual gaming and raster perf and is/has been on many occasions cheaper. They compete better than they have done in the past. Customers just buy Nvidia, and if that makes them feel 'screwed over'... yeah... a token of the snowflake generation, that also doesn't vote and then wonders why the world's going to shit.

You can't fix stupidity. Apparently people love to watch in apathy as things escalate into dystopia, spending money as they go and selling off their autonomy one purchase and subscription at a time.
#14
Dr. Dro
Sounds rather unlikely at the GB205 level; it'd imply a completely different topology than the other models. 512-bit also seems unlikely on GB202, but if true, the gulf would turn out even larger than the one between AD102 and AD103. This means an eventual GB203-based RTX 5080 would be less than 50% as powerful as the 5090, yet still beat practically every Ada card except the 4090. That's a scary thought!
#15
the54thvoid
Super Intoxicated Moderator
dgianstefani5080 Ti/5090 here I come.

3080 Ti has been great, but it's time for an upgrade.

Hoping for around ~40% better performance than Ada, more is great of course.

3080 Ti to 5090/5080 Ti ideally around 2x faster.

Since I framecap to 237 FPS, faster/more efficient cards also means a lower total wattage which is always nice, unless new games/software push the GPU that much harder, which I doubt. Ada was a significant efficiency leap and very tempting, but I don't upgrade every gen of GPU.
Two-generation gap? For me, the 2080 Ti to 4070 Ti was a 50% jump.

Settle for nothing less! :cool:
#16
Vayra86
Dr. DroSounds rather unlikely at the GB205 level, it'd imply a completely different topology than the other models. Also 512-bit seems unlikely on GB202, but that gulf would turn out even larger than the gulf between AD102 and AD103. This means an eventual GB203 RTX 5080 would be less than 50% as powerful as the 5090, and still beat practically every Ada card except the 4090, that's a scary thought!
Yeah, I don't really feel the 512-bit. Where is the 384-bit frankenstein cut they usually do? They won't miss that opportunity. The gap between the 202 and 203 is 150% in cores. They will cut the 202 up in several ways, I reckon.
#17
64K
Vayra86Yeah I don't really feel the 512 bit, where is this 384 bit frankenstein cut they usually do? They won't miss that opportunity. The gap between 202 and 203 is 150% in cores. They will cut 202 up in several ways I reckon.
Probably a 5080 Ti for when the 5080 flops, but that's just a guess. Nvidia will try to push the 5080 even if it's a disappointment, but they need to leave some maneuvering room in case it fails.
#18
Chrispy_
Jesus, GB206 (presumably the 5060 and 5060 Ti) is still 128-bit and still potentially 8 GB. Nvidia are expecting GDDR7 to do some heavy lifting there, and they've already complained about Micron GDDR7 not living up to claims, whilst Micron themselves aren't shouting their success from the rooftops.

IMO, there's no reason for these three variants to exist at all. I am 100% certain Nvidia will screw everyone on VRAM anyway, as part of the upsell:


The 5090 is 50% more compute than a 4090, so it'll cost $3,000. Nvidia will probably say it's 60% faster, which is how they'll justify the asking price. It's irrelevant anyway, because China will buy them all for AI, and demand will outstrip supply until something better than the 5090 comes along for AI datacenter racks.
#19
Dristun
Vayra86Yeah I don't really feel the 512 bit, where is this 384 bit frankenstein cut they usually do? They won't miss that opportunity. The gap between 202 and 203 is 150% in cores. They will cut 202 up in several ways I reckon.
An earlier rumour said the 90 is gonna come with 28 GB and comparably cut cores, but it also said the 5080 is 24 GB and carved out from GB202. Those rumour fellas know nothing, don't they? :D
#20
Vayra86
DristunAn earlier rumour said the 90 is gonna come with 28 GB and comparably cut cores, but it also said the 5080 is 24 GB and carved out from GB202. Those rumour fellas know nothing, don't they? :D
Yeah, it doesn't take rocket science to extrapolate numbers to a new gen, really. It's not like Nvidia is full of surprises here, or anywhere else. By default, we're screwed.
#21
Chaitanya
AssimilatorThe gap is intentionally large so that NVIDIA can charge an equally large price premium.
It will be a case of "the more you buy, the more you save," with the GB203 priced to make the GB202 more attractive.
#23
Sithaer
Curious to see how the 5060/Ti, and maybe a 5070, will end up.
I have no upgrade plans left for this year, but sometime next year I wouldn't mind upgrading my GPU, and that's the highest I'm willing to go/what my budget allows. 'Those will be plenty expensive enough where I live, even second hand.' :shadedshu:
#24
Onyx Turbine
I do think that the overall technical composition (GDDR7, etc.) will lead to a solid improvement over the 4000 series. What could be done better is stepping away from 8 GB VRAM at the entry level, but this is probably the going market size in terms of cost; 10 or at least 12 GB would be a fairer game.
The 5060 Ti or 5070 will probably cover my gaming needs for years to come, which is why I skipped, e.g., the 4060 Ti.
SithaerCurious to see how the 5060/Ti, and maybe a 5070, will end up.
I have no upgrade plans left for this year, but sometime next year I wouldn't mind upgrading my GPU, and that's the highest I'm willing to go/what my budget allows. 'Those will be plenty expensive enough where I live, even second hand.' :shadedshu:
Definitely curious about those two cards.
#25
Daven
AssimilatorI really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
AMD's inability to compete is because no one will buy their chips, even though they are very competitive against Nvidia's offerings. Luckily, you, Assimilator, have just volunteered to buy AMD as your next graphics card to help drive down Nvidia prices. I will join you, and together we will show everyone that the only way to bring about a competitive market is for everyone to stop buying brands and gimmicks and start buying great performance-per-dollar tech, regardless of what name is on the box.
dgianstefani5080 Ti/5090 here I come.

3080 Ti has been great, but it's time for an upgrade.

Hoping for around ~40% better performance than Ada, more is great of course.

3080 Ti to 5090/5080 Ti ideally around 2x faster.

Since I framecap to 237 FPS, faster/more efficient cards also means a lower total wattage which is always nice, unless new games/software push the GPU that much harder, which I doubt. Ada was a significant efficiency leap and very tempting, but I don't upgrade every gen of GPU.
I'll either be buying a 9950X3D and a Radeon 8900 XTX for my next build, or skip a generation and get Zen 6 and RDNA 5. Since AMD is best for gaming in my opinion and will continue to focus equally on gaming and AI, my dollars will continue to go to them until Nvidia stops wasting resources on RT and AI.