Tuesday, June 11th 2024

Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

Possible specifications of the various NVIDIA GeForce "Blackwell" gaming GPUs were leaked to the web by Kopite7kimi, a reliable source for NVIDIA leaks. These are the specs of the maxed-out silicon; NVIDIA will carve out several GeForce RTX 50-series SKUs based on these chips, which could end up with lower shader counts than those shown here. We've known from older reports that there will be five chips in all: the GB202 being the largest, followed by the GB203, the GB205, the GB206, and the GB207. There is a notable absence of a successor to the AD104, GA104, and TU104, because NVIDIA is trying a slightly different approach to the performance segment with this generation.

The GB202 is the halo segment chip that will drive the possible RTX 5090 (RTX 4090 successor). This chip is endowed with 192 streaming multiprocessors (SM), arranged in 96 texture processing clusters (TPCs) of two SMs each. These 96 TPCs are spread across 12 graphics processing clusters (GPCs), which each have 8 of them. Assuming that "Blackwell" has the same 256 CUDA cores per TPC that the past several generations of NVIDIA gaming GPUs have had, we end up with a total CUDA core count of 24,576. Another interesting aspect of this mega-chip is memory. The GPU implements the next-generation GDDR7 memory, and uses a mammoth 512-bit memory bus. Assuming the 28 Gbps memory speed that has been rumored for NVIDIA's "Blackwell" generation, this chip has 1,792 GB/s of memory bandwidth on tap!
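As a sanity check, the arithmetic behind these figures can be sketched in a few lines. The 2 SMs per TPC, 256 CUDA cores per TPC, and 28 Gbps figures are the rumored assumptions stated above, not confirmed specs:

```python
# Back-of-the-envelope check of the leaked GB202 figures.
# Assumptions (rumored, not confirmed): 256 CUDA cores per TPC
# (2 SMs x 128 cores), and 28 Gbps GDDR7.

def cuda_cores(tpcs: int, cores_per_tpc: int = 256) -> int:
    """CUDA core count from the TPC count."""
    return tpcs * cores_per_tpc

def bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Memory bandwidth in GB/s: bus width in bytes times per-pin speed."""
    return bus_width_bits / 8 * speed_gbps

print(cuda_cores(96))           # GB202: 12 GPCs x 8 TPCs = 96 TPCs -> 24576
print(bandwidth_gbs(512, 28))   # 512-bit GDDR7 at 28 Gbps -> 1792.0
```

The same two formulas reproduce every core count and bandwidth figure quoted in this article.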
The GB203 is the next chip in the series, and is poised to be a successor in name to the current AD103. It generationally reduces shader counts, counting on the architecture and clock speeds to more than make up the difference in performance, while retaining the 256-bit bus width of the AD103. The net result could be a significantly smaller GPU than the AD103, with better performance. The GB203 is endowed with 10,752 CUDA cores, spread across 84 SM (42 TPCs). The chip has 7 GPCs, each with 6 TPCs. The memory bus, as we mentioned, is 256-bit, and at a memory speed of 28 Gbps would yield 896 GB/s of bandwidth.

The GB205 will power the lower half of the performance segment in the GeForce "Blackwell" generation. This chip has a rather surprising CUDA core count of just 6,400, spread across 50 SM, which are arranged in 5 GPCs of 5 TPCs each. The memory bus width is 192-bit; at 28 Gbps, this would result in 672 GB/s of memory bandwidth.

The GB206 drives the mid-range of the series. This chip has 4,608 CUDA cores, spread across 36 SM (18 TPCs). The 18 TPCs span 3 GPCs of 6 TPCs each. The other key differentiator between the GB205 and GB206 is memory bus width, which is narrowed to 128-bit for the GB206. At the same 28 Gbps memory speed, such a chip would end up with 448 GB/s of memory bandwidth.

At the entry level, there is the GB207, a significantly smaller chip with just 2,560 CUDA cores across 20 SM, spanning two GPCs of 5 TPCs each. The memory bus width is unchanged at 128-bit, but the memory type used is the older-generation GDDR6. Assuming NVIDIA uses 18 Gbps memory speeds, it ends up with 288 GB/s on tap.
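Applying the same bandwidth formula across the whole leaked lineup reproduces each figure quoted above. The bus widths and memory speeds here are the rumored numbers from this article, not confirmed specs:

```python
# Memory bandwidth per leaked chip: bus width (bits) / 8 x per-pin speed (Gbps).
# All bus widths and speeds are rumored figures, not confirmed specs.
chips = {
    "GB202": (512, 28),  # GDDR7
    "GB203": (256, 28),  # GDDR7
    "GB205": (192, 28),  # GDDR7
    "GB206": (128, 28),  # GDDR7
    "GB207": (128, 18),  # GDDR6
}

for name, (bus_bits, gbps) in chips.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```

Running this prints 1,792, 896, 672, 448, and 288 GB/s respectively, matching the per-chip figures above.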

NVIDIA is expected to double down on large on-die caches on all its chips, to cushion the memory sub-systems. We expect there to be several other innovations in the areas of ray tracing performance, AI acceleration, and certain other features exclusive to the architecture. The company is expected to debut the series some time in Q4-2024.
Source: kopite7kimi (Twitter)

141 Comments on Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

#126
Nottoday
Assimilator: I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Leave it to the fanboy to pull out the term fanboy before anyone even comments, and then start gulping down Jensen's meat like you get anything in return from it.

Really looking forward to another disappointing launch, since that has been the rule since 2018.
I really wonder, how will Nvidia surprise us? Entry level at 500€ with a 92-bit bus and a 65 mm² die?
Posted on Reply
#127
Dr. Dro
Nottoday: Leave it to the fanboy to pull out the term fanboy before anyone even comments, and then start gulping down Jensen's meat like you get anything in return from it.

Really looking forward to another disappointing launch, since that has been the rule since 2018.
I really wonder, how will Nvidia surprise us? Entry level at 500€ with a 92-bit bus and a 65 mm² die?
If AMD does not step up, yes. But I have a feeling that things will be lively at the midrange with RDNA 4 and Battlemage.
Posted on Reply
#128
AleXXX666
Assimilator: I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
LMFAO, I'm an Nvidia fanboy, and I feel ashamed of the cr*p Nvidia offers with bus width. I'm OK with VRAM tho :D
Posted on Reply
#129
enb141
Durvelle27: RT still isn't viable, as the performance hit is still too big without DLSS

DLSS is ok but so is FSR

And yea, I hear that a lot. Which is funny, because I've used AMD since the HD 4000 days and haven't had driver issues since Hawaii. Which was quite some time ago.
Yeah right, my crappy 6400 that replaced my 1050 had trash drivers; they never fixed the issues, and in fact added more later on. And before you come here to tell me that I should report those issues: well, I did, and almost a year after I reported them, they were still there. So after hating that card for its crappy drivers, I ended up with a 4060, and now my driver problems are gone.

So for those of you who say that AMD has no driver issues: they do. Just check their forums or their Reddit; nobody is listening either.
Daven: Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive with AMD doing slightly better in performance and price as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what they can to prevent the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.
Nvidia has fewer driver issues than AMD by far; at least Nvidia tries to fix them, while AMD doesn't seem to care.
Posted on Reply
#130
THU31
I don't need 24 GB on a 5070, waste of power. I'll wait for the 18 GB Super refresh with 3 GB modules.

The cut-down memory buses compared to Ampere make no sense. The 60-class should be 192-bit and the 70-class should be 256-bit. That's the main problem with the Ada lineup, and it's gonna be the same here.
But I guess Jensen thinks more cache makes up for the lower capacity, not just bandwidth. It's not accurate, but it's correct.
Posted on Reply
#131
SJZL 2.0
Is nobody going to comment on how they incorrectly assumed the core count of the GB206? If a TPC has 256 cores and there are 18 TPCs, shouldn't it be 4,608 cores total? Or am I missing something here?
Posted on Reply
#132
starfals
Assimilator: I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Tell that to my 8GB brand-new card that can't even start The Last of Us cus of low VRAM. Yeah, the last driver bugged the game; Nvidia already stated as much in their driver info. It used to work fine with 8GB, or well... after the patches... The point is, having more VRAM is good. It's not empty crying; it can avoid issues, even if it does nothing for performance. I still get nightmares from GTA 4 not having enough VRAM... Sometimes games need more. You are on TechPowerUp; here you can CLEARLY see that fact from their reviews. Even for 1080p, some games need more. That's only 2024; imagine 2025? 2026? People don't buy a new GPU every year.

I too wish Nvidia would give us more, not cus people cry, but cus we NEEED it.
Posted on Reply
#133
TheinsanegamerN
starfals: Tell that to my 8GB brand-new card that can't even start The Last of Us cus of low VRAM. Yeah, the last driver bugged the game; Nvidia already stated as much in their driver info. It used to work fine with 8GB, or well... after the patches... The point is, having more VRAM is good. It's not empty crying; it can avoid issues, even if it does nothing for performance. I still get nightmares from GTA 4 not having enough VRAM... Sometimes games need more. You are on TechPowerUp; here you can CLEARLY see that fact from their reviews. Even for 1080p, some games need more. That's only 2024; imagine 2025? 2026? People don't buy a new GPU every year.

I too wish Nvidia would give us more, not cus people cry, but cus we NEEED it.
There are still a lot of people out there who are really hung up on the 8GB thing; they think that games today should be just fine on 8GB because games five years ago ran fine on it, and that anything that doesn't is coded wrong by bad devs and has nothing to do with the industry moving forward with newer tech.
Posted on Reply
#134
ARF
starfals: Tell that to my 8GB brand-new card that can't even start The Last of Us cus of low VRAM. Yeah, the last driver bugged the game; Nvidia already stated as much in their driver info. It used to work fine with 8GB, or well... after the patches... The point is, having more VRAM is good. It's not empty crying; it can avoid issues, even if it does nothing for performance. I still get nightmares from GTA 4 not having enough VRAM... Sometimes games need more. You are on TechPowerUp; here you can CLEARLY see that fact from their reviews. Even for 1080p, some games need more. That's only 2024; imagine 2025? 2026? People don't buy a new GPU every year.

I too wish Nvidia would give us more, not cus people cry, but cus we NEEED it.
TheinsanegamerN: There are still a lot of people out there who are really hung up on the 8GB thing; they think that games today should be just fine on 8GB because games five years ago ran fine on it, and that anything that doesn't is coded wrong by bad devs and has nothing to do with the industry moving forward with newer tech.
Limiting the VRAM to only 8 GB (the first graphics card ever to use this VRAM amount was the AMD Radeon R9 290X back in 2013, 11 years ago) effectively means that NVIDIA advocates lower-resolution screens, and works against improving the gamers' experience by moving to 2160p.


www.thefpsreview.com/2023/05/03/hogwarts-legacy-cyberpunk-2077-and-the-last-of-us-part-i-top-list-of-vram-heavy-pc-titles/
Posted on Reply
#135
Prince Valiant
There's no good excuse for refusing to increase capacity.
Posted on Reply
#136
RadeonUser
I hope they are great cards. I have no interest in RT, as it is mostly useless for my use case.

Happy 7900 XT user with no care in the world for things that are not even 25% matured or usable for most.

There will be a time things become mainstream; this time is not now, but I am not here to control who does what.

Some people rip into console users and their experiences of fake resolutions whilst pushing their favorite upscaler.

Cognitive dissonance and a general intellect drop have been seen since the dawn of the RTX lineup.
enb141: Yeah right, my crappy 6400 that replaced my 1050 had trash drivers; they never fixed the issues, and in fact added more later on. And before you come here to tell me that I should report those issues: well, I did, and almost a year after I reported them, they were still there. So after hating that card for its crappy drivers, I ended up with a 4060, and now my driver problems are gone.

So for those of you who say that AMD has no driver issues: they do. Just check their forums or their Reddit; nobody is listening either.



Nvidia has fewer driver issues than AMD by far; at least Nvidia tries to fix them, while AMD doesn't seem to care.
Sorry your user experience is not good.

I click play, it works.

pcgamingwiki.com is your friend; a lot of issues have nothing to do with drivers. Sometimes it's just Windows messing with stuff, which is why I run Windows 11 Enterprise 24H2.

Posted on Reply
#137
Tech Ninja
Assimilator: I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
amen! It's so bad we have to pray Intel will overtake AMD, since AMD has abandoned the high-end market. AMD has nothing capable of competing with the 4090, 4080 Super, 4080, 4070 Ti Super, or even the 3090/3080.
Posted on Reply
#138
64K
Tech Ninja: amen! It's so bad we have to pray Intel will overtake AMD, since AMD has abandoned the high-end market. AMD has nothing capable of competing with the 4090, 4080 Super, 4080, 4070 Ti Super, or even the 3090/3080.
What do you mean? AMD has GPUs that compete with everything other than the 4090, but according to the Steam Hardware Survey less than 1% of gamers are using a 4090 anyway, so is that really important to compete with?

If you are bringing ray tracing into it then that changes things though but maybe not for next gen AMD.
Posted on Reply
#139
dgianstefani
TPU Proofreader
ARF: Limiting the VRAM to only 8 GB (the first graphics card ever to use this VRAM amount was the AMD Radeon R9 290X back in 2013, 11 years ago) effectively means that NVIDIA advocates lower-resolution screens, and works against improving the gamers' experience by moving to 2160p.


www.thefpsreview.com/2023/05/03/hogwarts-legacy-cyberpunk-2077-and-the-last-of-us-part-i-top-list-of-vram-heavy-pc-titles/
xx60 users shouldn't expect to play at 4K Ultra with RT (without upscaling). Taking data that implies they can't do so due to VRAM limitations, then complaining about the frame buffer and promoting 16 GB AMD options (which still lose to the 12 GB NVIDIA options at all resolutions in TPU testing) because they have more VRAM, despite the x600 class also having 8 GB BTW, is irrelevant to the use case and disingenuous. 1080p is by far the most popular resolution, and even 1440p is still quite playable with an 8 GB card, even at Ultra, considering TPU testing.
64K: What do you mean? AMD has GPUs that compete with everything other than the 4090, but according to the Steam Hardware Survey less than 1% of gamers are using a 4090 anyway, so is that really important to compete with?

If you are bringing ray tracing into it then that changes things though, but maybe not for next gen AMD.
The first AMD discrete GPU in the Steam charts is the RX 580, in position #31 with 0.91%, so less than the 4090. I suppose that gives an idea of how well consumers think they compete. The NVIDIA xx60 class is, of course, #1.
Posted on Reply
#140
enb141
RadeonUser: I hope they are great cards. I have no interest in RT, as it is mostly useless for my use case.

Happy 7900 XT user with no care in the world for things that are not even 25% matured or usable for most.

There will be a time things become mainstream; this time is not now, but I am not here to control who does what.

Some people rip into console users and their experiences of fake resolutions whilst pushing their favorite upscaler.

Cognitive dissonance and a general intellect drop have been seen since the dawn of the RTX lineup.


Sorry your user experience is not good.

I click play, it works.

pcgamingwiki.com is your friend; a lot of issues have nothing to do with drivers. Sometimes it's just Windows messing with stuff, which is why I run Windows 11 Enterprise 24H2.

I'll check out that Enterprise Windows 11, but trust me, AMD has driver issues that are specifically AMD's fault, not Windows' or the apps' fault.
Posted on Reply
#141
DeathReborn
ARF: Limiting the VRAM to only 8 GB (the first graphics card ever to use this VRAM amount was the AMD Radeon R9 290X back in 2013, 11 years ago) effectively means that NVIDIA advocates lower-resolution screens, and works against improving the gamers' experience by moving to 2160p.


www.thefpsreview.com/2023/05/03/hogwarts-legacy-cyberpunk-2077-and-the-last-of-us-part-i-top-list-of-vram-heavy-pc-titles/
Slight correction on this: the 290X 8GB launched in November 2014; only the 4GB version launched in 2013. While the 390X was the first desktop GPU with 8GB, the first 8GB GPU was the 880M, launched in March 2014, followed by the 980M in October 2014. Then in March 2015 Nvidia went to 12GB with the Titan X. You should also remember that AMD released a 2GB 32-bit GPU in 2022 (it may have been OEM-only, but they still launched it), along with a 64-bit 4GB laptop salvage part in the 6400/6500 XT.

I'm not saying 8GB today is good (and Nvidia certainly overprices it by $100 or more), but you have to admit both sides have their issues with the amount of VRAM they use. We should also slam the memory IC makers, who could have made 4GB chips for GDDR5X, GDDR6 & GDDR6X (plus 3GB for the latter two) but instead stuck with 1GB or 2GB chips. With GDDR7 they have dropped the 1GB, but 3GB is not coming until a year or more after the 2GB ones.

I can certainly see the 5050, 5060, and 5060 Ti (maybe) being by far the worst choices compared to AMD & Battlemage in the sub-$500 market, if only because of the VRAM limiting their usefulness. I certainly wouldn't recommend an 8GB card for anything but eSports or casual light gaming, and for under $200, preferably $150 or less.
Posted on Reply