Tuesday, June 11th 2024

Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

Possible specifications of the various NVIDIA GeForce "Blackwell" gaming GPUs were leaked to the web by Kopite7kimi, a reliable source for NVIDIA leaks. These are the specs of the maxed-out silicon; NVIDIA will carve out several GeForce RTX 50-series SKUs from these chips, which could end up with lower shader counts than those shown here. We've known from older reports that there will be five chips in all, the GB202 being the largest, followed by the GB203, the GB205, the GB206, and the GB207. There is a notable absence of a successor to the AD104, GA104, and TU104, because NVIDIA is taking a slightly different approach to the performance segment with this generation.

The GB202 is the halo-segment chip that will drive the possible RTX 5090 (RTX 4090 successor). The chip is endowed with 192 streaming multiprocessors (SM), or 96 texture processing clusters (TPCs). These 96 TPCs are spread across 12 graphics processing clusters (GPCs), each of which has 8 of them. Assuming that "Blackwell" retains the 256 CUDA cores per TPC that the past several generations of NVIDIA gaming GPUs have had, we end up with a total CUDA core count of 24,576. Another interesting aspect of this mega-chip is its memory. The GPU implements next-generation GDDR7 memory and uses a mammoth 512-bit memory bus. Assuming the 28 Gbps memory speed that has been rumored for NVIDIA's "Blackwell" generation, this chip has 1,792 GB/s of memory bandwidth on tap!
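For a quick sanity check of those figures, here is a minimal sketch (Python, with the rumored numbers hard-coded as assumptions, none of them official) of how the CUDA core count and memory bandwidth are derived:

```python
# Rumored GB202 layout; all values are from the leak, not confirmed specs.
GPCS = 12             # graphics processing clusters
TPCS_PER_GPC = 8      # texture processing clusters per GPC
CORES_PER_TPC = 256   # 2 SMs per TPC x 128 CUDA cores per SM, as in recent generations

cuda_cores = GPCS * TPCS_PER_GPC * CORES_PER_TPC
print(cuda_cores)      # 24576

BUS_WIDTH_BITS = 512   # rumored memory bus width
DATA_RATE_GBPS = 28    # rumored GDDR7 data rate per pin

bandwidth_gb_s = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8  # bits/s -> bytes/s
print(bandwidth_gb_s)  # 1792.0 GB/s
```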
The GB203 is the next chip in the series, and is poised to be the successor in name to the current AD103. It generationally reduces the shader count, counting on the architecture and clock speeds to more than make up the difference, while retaining the 256-bit bus width of the AD103. The net result could be a significantly smaller GPU than the AD103 with better performance. The GB203 is endowed with 10,752 CUDA cores, spread across 84 SM (42 TPCs). The chip has 7 GPCs, each with 6 TPCs. The memory bus, as mentioned, is 256-bit, and at a memory speed of 28 Gbps would yield 896 GB/s of bandwidth.

The GB205 will power the lower half of the performance segment in the GeForce "Blackwell" generation. This chip has a rather surprising CUDA core count of just 6,400, spread across 50 SM, which are arranged in 5 GPCs of 5 TPCs each. The memory bus width is 192-bit, which at 28 Gbps would result in 672 GB/s of memory bandwidth.

The GB206 drives the mid-range of the series. The chip is endowed with 4,608 CUDA cores, spread across 36 SM (18 TPCs). The 18 TPCs span 3 GPCs of 6 TPCs each. The key differentiator between the GB205 and GB206 is the memory bus width, which is narrowed to 128-bit for the GB206. With the same 28 Gbps memory speed in use here, such a chip would end up with 448 GB/s of memory bandwidth.

At the entry level there is the GB207, a significantly smaller chip with just 2,560 CUDA cores across 20 SM (10 TPCs), spanning two GPCs of 5 TPCs each. The memory bus width is unchanged at 128-bit, but the memory type used is older-generation GDDR6. Assuming NVIDIA uses 18 Gbps memory speeds, it ends up with 288 GB/s on tap.
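Applying the same arithmetic across the whole leaked stack gives a compact overview. The sketch below (Python, with every per-chip figure taken from the rumor and subject to change) reproduces the core counts and bandwidth numbers quoted above:

```python
# Rumored per-chip layout: (GPCs, TPCs per GPC, bus width in bits, memory speed in Gbps).
chips = {
    "GB202": (12, 8, 512, 28),
    "GB203": (7, 6, 256, 28),
    "GB205": (5, 5, 192, 28),
    "GB206": (3, 6, 128, 28),
    "GB207": (2, 5, 128, 18),  # GDDR6 instead of GDDR7
}

CORES_PER_TPC = 256  # assumed, as in recent NVIDIA generations

for name, (gpcs, tpcs_per_gpc, bus_bits, gbps) in chips.items():
    cores = gpcs * tpcs_per_gpc * CORES_PER_TPC
    bandwidth = bus_bits * gbps / 8  # GB/s
    print(f"{name}: {cores} CUDA cores, {bandwidth:.0f} GB/s")
# GB202: 24576 CUDA cores, 1792 GB/s ... GB207: 2560 CUDA cores, 288 GB/s
```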

NVIDIA is expected to double down on large on-die caches across all of these chips to cushion their memory sub-systems. We expect several other innovations in the areas of ray tracing performance, AI acceleration, and other features exclusive to the architecture. The company is expected to debut the series sometime in Q4 2024.
Source: kopite7kimi (Twitter)

141 Comments on Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

#76
dgianstefani
TPU Proofreader
Markosz: 99% of that 88% of NVIDIA owners don't have X080 or X090 cards. Most people have lower-tier cards, 50, 60, maybe a few 70s, where AMD is actually competent and better value for the price. Most people just blindly buy into NVIDIA without any research, because that's what they hear everywhere. Like people still believing Intel CPUs are the only thing to buy even today, because they live in a cave or something.

If you check modern games, they almost all use over 12 GB of VRAM at 4K. There is nothing to argue over here; Nvidia is limiting the VRAM to upsell people to a higher-tier, more expensive card and to limit how future-proof those cards are, just like when they release a new technology and limit it to only their newest series.
So these lower-tier cards most people have, would you also say their owners tend to have 4K monitors?

Seems most people have 1080p or 1440p...

From TPU's latest 4070S review.
Alan Wake 2 RT @ 4K is what I would call an edge case.

If you want to play native 4K RT, you'd best be buying a 4070 Ti S or better lol.
Posted on Reply
#77
Chrispy_
Markosz: 99% of that 88% of NVIDIA owners don't have X080 or X090 cards. Most people have lower-tier cards, 50, 60, maybe a few 70s
Yup.
The Steam hardware survey had the GTX 1650 as the most popular GPU of 2023, and it was finally overtaken this year by the 3060, a three-year-old GPU you can buy for $260, which is cheaper than even the cheapest 40-series offering.

AMD APUs and Intel IGPs have a far higher market share than the 4070 Ti, 4080, and 4090, and all three of those high-end 40-series GPUs combined are still outnumbered by 2016's GTX 1060.
dgianstefani: So these lower-tier cards most people have, would you also say their owners tend to have 4K monitors?
One thing to remember is that 4K monitors start at under $185, with good-quality, reviewer-recommended models starting from about $250, and a monitor purchase can easily last a decade.

Someone can get themselves a decent 4K monitor and the cost increase over a half-decent 1080p model may be under $100, which is only an extra $10 a year over that decade.

Only pure gamers who need to spend every single penny of their budget on GPUs will compromise their monitor purchase to try and buy a higher-tier GPU. For anyone who also works, enjoys movies, TV shows, or just likes lots of desktop real estate, 4K monitors are both useful and extremely affordable. Nobody is forcing you to game at native resolution, and now with DLSS/FSR/XeSS you can enjoy the benefits of a crisp, native-resolution UI whilst the game itself renders at a resolution your GPU can handle, whatever that may be...
Posted on Reply
#78
Ruru
S.T.A.R.S.
Is a kilowatt PSU already needed for a single card?
Posted on Reply
#79
evernessince
32 GB is really the minimum amount of memory the 5090 needs, TBH. The 24 GB on my 4090 is barely enough to train LoRAs for SDXL, and running SDXL inference with IP adapters / LoRAs pushes it as well. 32 GB isn't enough to fine-tune the main model, but at least it'll let the card keep up in other regards.

The lack of VRAM on cards lower down the stack would be expected but unfortunate. If I had to give people a tip, it would be that graphics have never been the deciding factor in whether a game is good or not, and frankly the returns are ever diminishing. It'd be one thing if we got actual AI in games (which would take up significant memory) that wasn't dumb as bricks, but slightly better-looking graphics isn't worth taking the poison pills Nvidia wants you to swallow with its VRAM on cards below the xx90 and their pricing.
Posted on Reply
#80
RedelZaVedno
The GB203 with only a 256-bit bus & 10,752 CUDA cores looks barely better than the 4080's specs, and we all know more cache is not a silver bullet. It could be that the 5080 will be no faster, or even a bit slower, than the 4090 in rasterization at 4K and above. No competition = no progress :banghead:
Posted on Reply
#81
Unregistered
If the 5080 is what it takes to match a 4090 then I give up.
I'm not dropping over a grand on that crap and still buying a 4K monitor...
Posted on Edit | Reply
#82
ARF
evernessince: 32 GB is really the minimum amount of memory the 5090 needs, TBH. The 24 GB on my 4090 is barely enough to train LoRAs for SDXL, and running SDXL inference with IP adapters / LoRAs pushes it as well. 32 GB isn't enough to fine-tune the main model, but at least it'll let the card keep up in other regards.

The lack of VRAM on cards lower down the stack would be expected but unfortunate. If I had to give people a tip, it would be that graphics have never been the deciding factor in whether a game is good or not, and frankly the returns are ever diminishing. It'd be one thing if we got actual AI in games (which would take up significant memory) that wasn't dumb as bricks, but slightly better-looking graphics isn't worth taking the poison pills Nvidia wants you to swallow with its VRAM on cards below the xx90 and their pricing.
Yes, but those are not typical gaming loads; this is something you would want from a workstation card, not a GeForce.

I agree that the VRAM on the lower models is not enough, and the GB203 specs are too low.
Posted on Reply
#83
neatfeatguy
Chomiq: You'd be surprised how often I've heard "Aaaaand the AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
And you'd be surprised that back when my brother was running his 5700 XT and I was on my 3080, we'd be playing co-op and my game would crash to desktop while his wouldn't...

What's your point?
Posted on Reply
#84
JohH
Only the "5090" may be worth buying.
But I'm planning to skip it.
Posted on Reply
#85
ARF
neatfeatguy: And you'd be surprised that back when my brother was running his 5700 XT and I was on my 3080, we'd be playing co-op and my game would crash to desktop while his wouldn't...

What's your point?
Maybe that the older RDNA 2 cards have some type of issue running on the new AM5 platform... Who knows.
Chrispy_: Jesus, GB206 (presumably the 5060 and 5060 Ti) is still 128-bit and still potentially 8 GB. Nvidia are expecting GDDR7 to do some heavy lifting there, and they've already complained about Micron GDDR7 not living up to claims, whilst Micron themselves aren't shouting their success from the rooftops.

IMO, there's no reason for these three variants to exist at all. I am 100% certain Nvidia will screw everyone on VRAM anyway, as part of the upsell:

The 5090 has 50% more compute than a 4090, so it'll cost $3000. Nvidia will probably say it's 60% faster, which is how they'll justify the asking price, which is irrelevant because China will buy them all for AI anyway and demand will outstrip supply until something better than the 5090 comes along for AI datacenter racks.
Err, nope.
GB202 = 28GB
GB203 = 24GB
GB205 = 20GB
GB206 = 16GB
GB207 = 12GB

This is ok now.
Posted on Reply
#86
Isaak
Daven:
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Man, doesn't it just suck that an internet myth broke my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
Posted on Reply
#87
Vayra86
RedelZaVedno: The GB203 with only a 256-bit bus & 10,752 CUDA cores looks barely better than the 4080's specs, and we all know more cache is not a silver bullet. It could be that the 5080 will be no faster, or even a bit slower, than the 4090 in rasterization at 4K and above. No competition = no progress :banghead:
GDDR7. So these cards are def moving up a little bit. Emphasis on little, I guess
Isaak: Man, doesn't it just suck that an internet myth broke my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
Pics or it didn't happen ;) A driver... for a GPU, breaking a display?
Posted on Reply
#88
Daven
Isaak: Man, doesn't it just suck that an internet myth broke my display after updating my RX 580's drivers a couple of years ago. I sure hope this myth gets relegated to an old wives' tale sometime soon; that might mean this non-existent problem affects even fewer fictional users.
Why does everyone think that if their computer crashes, every computer in the world is also crashing? I'm sorry to hear about your problems. I hope you find a solution on the web or through some third party. Please do not forget that AMD has sold millions and millions of GPUs through discrete cards, iGPUs, consoles, smartphones, gambling machines, etc. There are no reported widespread outages, because if there were, it would be breaking news on major media platforms. Your issues are yours, not mine and not anyone else's.
Posted on Reply
#89
Tigerfox
ARF: Err, nope.
GB202 = 28GB
GB203 = 24GB
GB205 = 20GB
GB206 = 16GB
GB207 = 12GB
Ehm, nope? The first generation of GDDR7 will only have the same 16 Gb per-chip capacity that GDDR6(X) has now; 24 Gb modules will only come later. We might see GPUs with 1.5x the VRAM later in the Blackwell life cycle or with a refresh, but certainly not from the start. So GB203 = 16 GB max, or 32 GB via clamshell.
Posted on Reply
#90
N/A
I can't see the full GB203 being more than 15% faster than a 4080 Super in raster, and that's barely worthy of a 5070 Ti moniker.
Posted on Reply
#91
Gameslove
Can the RTX 5090 run Hellblade 2: Senua's Saga at 8K max settings with no DLSS and achieve an average of 30 FPS?
Posted on Reply
#92
Isaak
Daven: Why does everyone think that if their computer crashes, every computer in the world is also crashing? I'm sorry to hear about your problems. I hope you find a solution on the web or through some third party. Please do not forget that AMD has sold millions and millions of GPUs through discrete cards, iGPUs, consoles, smartphones, gambling machines, etc. There are no reported widespread outages, because if there were, it would be breaking news on major media platforms. Your issues are yours, not mine and not anyone else's.
Never did I claim that all AMD products immediately catch fire and kick the nearest puppy, as you imply. You can read, you know this; there is no reason to exaggerate.
It is, however, curious that some people consider AMD drivers' bad reputation to be a "myth" when the number of complaints is visibly higher than from NVIDIA users. I switched to NVIDIA shortly after and have had no such episode since. Such a stark difference might sound ridiculous, but yeah, not once have I had a significant issue after updating NVIDIA drivers.

This is my personal experience and I notice more people complain about AMD drivers than NVIDIA's. Do with that information as you will.
Vayra86: Pics or it didn't happen ;) A driver... for a GPU, breaking a display?
Would that I could. As said, this was years ago, back in the RTX 20 era, so no pics now. The monitor didn't explode, but no image was displayed anymore. It was a frustrating experience figuring out what was going on, but I eventually switched to the iGPU and was able to roll back or uninstall the latest Radeon drivers; I don't remember which one was the solution.
Posted on Reply
#93
RGAFL
The 5080 is going to be about 10-20% short of a 4090, game dependent. This is on late beta hardware. 5080s will be starting at roughly £1,250 and 5090s at about £1,900. There is also a new DLSS version coming out, but it may not be ready for launch. Keep it quiet though.
Posted on Reply
#94
neatfeatguy
Isaak: Would that I could. As said, this was years ago, back in the RTX 20 era, so no pics now. The monitor didn't explode, but no image was displayed anymore. It was a frustrating experience figuring out what was going on, but I eventually switched to the iGPU and was able to roll back or uninstall the latest Radeon drivers; I don't remember which one was the solution.
That doesn't sound like the drivers broke your display. If you moved to the iGPU and the display worked, then your display wasn't broken. Sounds like a driver issue for you.

Don't feel bad; everyone has had some kind of annoyance, from minor to major issues, with a driver on Nvidia's or AMD's side over their lifetime of using them. Anyone who says otherwise is lying, because as much as some people would like to believe, not every driver will work with every hardware/software configuration flawlessly.

I've had issues with Nvidia drivers that:
  • broke video playback (nothing would play back in videos, just a black screen)
  • caused stuttering
  • caused flickering of images
  • prevented secondary and tertiary monitors from waking up after going to sleep (the image wouldn't come back on them; I had to power cycle the computer to get it back)
  • caused all blacks (shadows) in games to have green flickering
  • caused instability with SLI configurations
  • gave me random driver failures with crashes to desktop
I learned some time ago that if a driver works and I don't get issues playing games, I have zero reasons to update the driver. There are times I run a driver for a year or more before I have to update it because I'm doing a re-install of Windows or swapping GPUs for whatever reason or maybe a new game I'm playing just won't run well without updated drivers.
Posted on Reply
#95
ARF
Tigerfox: Ehm, nope? The first generation of GDDR7 will only have the same 16 Gb per-chip capacity that GDDR6(X) has now; 24 Gb modules will only come later. We might see GPUs with 1.5x the VRAM later in the Blackwell life cycle or with a refresh, but certainly not from the start. So GB203 = 16 GB max, or 32 GB via clamshell.
12 GB VRAM = 6 x 2GB chips.
16 GB VRAM = 8 x 2GB chips.
20 GB VRAM = 10 x 2GB chips.
24 GB VRAM = 12 x 2GB chips.
28 GB VRAM = 7 x 2 x 2GB via clamshell.
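
For reference, a minimal sketch of the channel math being traded back and forth here (Python; the 32-bit channel width and 2 GB first-generation GDDR7 modules are assumptions taken from this discussion, not confirmed specs):

```python
# Hypothetical helper: VRAM capacity from bus width, assuming 32-bit memory
# channels and 2 GB (16 Gb) modules; clamshell mode puts two chips on one channel.
def vram_gb(bus_width_bits: int, module_gb: int = 2, clamshell: bool = False) -> int:
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return chips * module_gb

print(vram_gb(512))                   # 32 GB on a full 512-bit GB202
print(vram_gb(256))                   # 16 GB on a 256-bit GB203
print(vram_gb(256, clamshell=True))   # 32 GB on GB203 via clamshell
print(vram_gb(224, clamshell=True))   # 28 GB from 7 channels in clamshell, as listed above
```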
Posted on Reply
#96
Tigerfox
@ARF: Yeah, I know. GB203 will be 8x 32-bit, so 16 GB. 20/24/28 GB without clamshell will require GB202.

Edit: not 6x 32-bit but 8x 32-bit. I hope...
Posted on Reply
#97
Isaak
neatfeatguy: That doesn't sound like the drivers broke your display. If you moved to the iGPU and the display worked, then your display wasn't broken. Sounds like a driver issue for you.

I learned some time ago that if a driver works and I don't get issues playing games, I have zero reasons to update the driver. There are times I run a driver for a year or more before I have to update it because I'm doing a re-install of Windows or swapping GPUs for whatever reason or maybe a new game I'm playing just won't run well without updated drivers.
I used to do that too, but games get a "day 1 driver" release so often that I just always update now. Might go on another abstinence streak now that you mention it.

Dunno what term is correct, but the screen was on and showing only black. No image. Maybe "the display broke" is not accurate, but that's what I meant.
Posted on Reply
#98
ARF
Tigerfox: @ARF: Yeah, I know. GB203 will be 8x 32-bit, so 16 GB. 20/24/28 GB without clamshell will require GB202.

Edit: not 6x 32-bit but 8x 32-bit. I hope...
Well, GB203 can be 6x 32-bit, so 24 GB ;)
Posted on Reply
#99
Tigerfox
@ARF: But only with clamshell, and that would not make any sense.
Posted on Reply
#100
ARF
Tigerfox: @ARF: But only with clamshell, and that would not make any sense.
A GB203 with only 16 GB will run out of VRAM, or will need software hacks to lower image quality / texture resolution, in a game like Microsoft Flight Simulator, which already saturates the 20 GB RX 7900 XT.

Posted on Reply