Tuesday, June 11th 2024

Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

Possible specifications of the various NVIDIA GeForce "Blackwell" gaming GPUs were leaked to the web by Kopite7kimi, a reliable source for NVIDIA leaks. These are the specs of the maxed-out silicon; NVIDIA will carve out several GeForce RTX 50-series SKUs based on these chips, which could end up with lower shader counts than those shown here. We've known from older reports that there will be five chips in all, the GB202 being the largest, followed by the GB203, the GB205, the GB206, and the GB207. There is a notable absence of a successor to the AD104, GA104, and TU104, because NVIDIA is taking a slightly different approach to the performance segment with this generation.

The GB202 is the halo-segment chip that will drive the possible RTX 5090 (RTX 4090 successor). This chip is endowed with 192 streaming multiprocessors (SM), or 96 texture processing clusters (TPCs). These 96 TPCs are spread across 12 graphics processing clusters (GPCs), each with 8 of them. Assuming that "Blackwell" has the same 256 CUDA cores per TPC that the past several generations of NVIDIA gaming GPUs have had, we end up with a total CUDA core count of 24,576. Another interesting aspect of this mega-chip is memory. The GPU implements next-generation GDDR7 memory and uses a mammoth 512-bit memory bus. Assuming the 28 Gbps memory speed that has been rumored for NVIDIA's "Blackwell" generation, this chip has 1,792 GB/s of memory bandwidth on tap!
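For readers who want to sanity-check these leaked figures, the arithmetic is simple: CUDA cores follow from the TPC count at 256 cores per TPC, and memory bandwidth is the bus width (in bits) multiplied by the per-pin data rate, divided by 8 to convert to bytes. Below is a minimal Python sketch of that math for the rumored GB202 configuration; the helper names are ours, and the inputs are leaked numbers, not confirmed specs.

```python
# Back-of-the-envelope check of the rumored GB202 configuration.
# All inputs are leaked/rumored figures, not confirmed specifications.

CORES_PER_TPC = 256  # 2 SM per TPC x 128 CUDA cores per SM, as in recent generations

def cuda_cores(gpcs: int, tpcs_per_gpc: int) -> int:
    """Total CUDA cores from the GPC/TPC layout."""
    return gpcs * tpcs_per_gpc * CORES_PER_TPC

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Memory bandwidth in GB/s: bus width (bits) x data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

print(cuda_cores(gpcs=12, tpcs_per_gpc=8))  # 24576 CUDA cores
print(bandwidth_gb_s(512, 28))              # 1792.0 GB/s with 28 Gbps GDDR7
```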
The GB203 is the next chip in the series, and is poised to be a successor in name to the current AD103. It generationally reduces shader counts, counting on the architecture and clock speeds to more than make up for it in performance, while retaining the 256-bit bus width of the AD103. The net result could be a significantly smaller GPU than the AD103 with better performance. The GB203 is endowed with 10,752 CUDA cores, spread across 84 SM (42 TPCs). The chip has 7 GPCs, each with 6 TPCs. The memory bus, as mentioned, is 256-bit, and at a memory speed of 28 Gbps would yield 896 GB/s of bandwidth.

The GB205 will power the lower half of the performance segment in the GeForce "Blackwell" generation. This chip has a rather surprising CUDA core count of just 6,400, spread across 50 SM, which are arranged in 5 GPCs of 5 TPCs each. The memory bus width is 192-bit; at 28 Gbps, this would result in 672 GB/s of memory bandwidth.

The GB206 drives the mid-range of the series. The chip has 4,608 CUDA cores, spread across 36 SM (18 TPCs). The 18 TPCs span 3 GPCs of 6 TPCs each. The key differentiator between the GB205 and GB206 is memory bus width, which is narrowed to 128-bit for the GB206. With the same 28 Gbps memory speed being used here, such a chip would end up with 448 GB/s of memory bandwidth.

At the entry level there is the GB207, a significantly smaller chip with just 2,560 CUDA cores across 20 SM (10 TPCs), spanning two GPCs of 5 TPCs each. The memory bus width is unchanged at 128-bit, but the memory type used is the older-generation GDDR6. Assuming NVIDIA uses 18 Gbps memory speeds, the chip ends up with 288 GB/s on tap.
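Putting the whole rumored stack together, the same two formulas reproduce every figure quoted above. The sketch below tabulates the leaked GPC/TPC layouts, bus widths, and memory speeds; again, these are rumored values and the table layout is our own.

```python
# Rumored GeForce "Blackwell" lineup per the leak: (GPCs, TPCs per GPC, bus bits, Gbps)
lineup = {
    "GB202": (12, 8, 512, 28),  # GDDR7
    "GB203": (7, 6, 256, 28),   # GDDR7
    "GB205": (5, 5, 192, 28),   # GDDR7
    "GB206": (3, 6, 128, 28),   # GDDR7
    "GB207": (2, 5, 128, 18),   # GDDR6
}

CORES_PER_TPC = 256  # assumed, same as recent NVIDIA generations

for chip, (gpcs, tpcs, bus_bits, gbps) in lineup.items():
    cores = gpcs * tpcs * CORES_PER_TPC
    bandwidth = bus_bits * gbps / 8  # GB/s
    print(f"{chip}: {cores:>6} CUDA cores, {bus_bits}-bit @ {gbps} Gbps = {bandwidth:.0f} GB/s")

# GB202:  24576 CUDA cores, 512-bit @ 28 Gbps = 1792 GB/s
# GB203:  10752 CUDA cores, 256-bit @ 28 Gbps = 896 GB/s
# GB205:   6400 CUDA cores, 192-bit @ 28 Gbps = 672 GB/s
# GB206:   4608 CUDA cores, 128-bit @ 28 Gbps = 448 GB/s
# GB207:   2560 CUDA cores, 128-bit @ 18 Gbps = 288 GB/s
```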

NVIDIA is expected to double down on large on-die caches across all of its chips to cushion the memory sub-systems. We expect several other innovations in the areas of ray tracing performance, AI acceleration, and other features exclusive to the architecture. The company is expected to debut the series sometime in Q4 2024.
Source: kopite7kimi (Twitter)

141 Comments on Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

#51
Vayra86
TigerfoxAs @Assimilator said, that was always only one or two quarters in a row (Q2-Q3 2015, Q4 2018, not 2019, Q4 2020) and then since Q3 2022 the last seven quarters in a row.

What is it with that random line that isn't even a real line? Did you just fail at drawing a straight line from the first to the last shown quarter, or did you connect random quarters on purpose?
Oh that was just my free hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it. And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years, (rebrands, revivals, etc.) they never consistently clawed back share.

Is AMD a story of lost and wasted potential? Damn sure. I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, with little reasons included to completely lose the PC market, even a small presence won't kill their margins on GPU, its just extra.
#52
Quicks
Hope they don't expect people to buy 8GB & 12GB cards this time around. They should really not release any 128Bit or even 192Bit cards.
#53
Tigerfox
Vayra86The trend is clear anyway, isn't it. And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages.
No, it is not and that's my point. Up until Q1 2020 there is much up and down and while the second quarter in the graphic, Q2 2014, was their highest, Q1 2020 was pretty high too. But even until Q1 2022 it was short but heavy downs and loong but slow ups.

The only trend I see is that they never get above 40% marketshare and that they have been below 20% for the last seven quarters - but even here, with ups of 5%-points.
Vayra86Is AMD a story of lost and wasted potential? Damn sure. I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, with little reasons included to completely lose the PC market, even a small presence won't kill their margins on GPU, its just extra.
You can talk someone to death. They have certainly done better before, but they have always come up with innovations to carve out their market share. RDNA3 hasn't been as successful as hoped, but RDNA2 showed again that innovation can beat long-term market domination to a certain point.

Since I remember the release of the Radeon 9700 Pro and the HD 5870, which had NV overwhelmingly beaten for months, as well as the HD 48x0 and the whole GCN1-3 series, which didn't beat NV in every aspect but were so competitive that prices went down to an all-time low, I am certain AMD Radeon will rise up again.
#54
hsew
Vayra86Oh that was just my free hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it. And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years, (rebrands, revivals, etc.) they never consistently clawed back share.

Is AMD a story of lost and wasted potential? Damn sure. I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, with little reasons included to completely lose the PC market, even a small presence won't kill their margins on GPU, its just extra.
Is anybody actually buying those high margin 7900 cards? AMD’s current strategy seems to be competing with Intel. That’s not a good sign.
#55
Vayra86
TigerfoxNo, it is not and that's my point. Up until Q1 2020 there is much up and down and while the second quarter in the graphic, Q2 2014, was their highest, Q1 2020 was pretty high too. But even until Q1 2022 it was short but heavy downs and loong but slow ups.

The only trend I see is that they never get above 40% marketshare and that they have been below 20% for the last seven quarters - but even here, with ups of 5%-points.
Yeah you could also read that out of it. But another trend along the whole graph is that their peaks consistently lowered over time, and they bottom out in an ever shorter cadence, until they arrive at consistent below 18% share. They had some not-too-shitty years, but nowhere is there consistent growth YoY.

If anything those ups show that there is potential to regain market share. But selling a quarter or a few quarters 'better' than your usual trend of 'down YoY' isn't a positive. It just means you reacted to the market proper with either pricing or timing. And in AMD's case, its always pricing. Pricing under the competition.
#56
Onasi
QuicksHope they don't expect people to buy 8GB & 12GB cards this time around. They should really not release any 128Bit or even 192Bit cards.
The people will buy what is offered. Thinking otherwise is applying a DIY tech enthusiast mindset to general consumers. And the general consumer doesn't really choose what he is buying, he is TOLD - by marketing, influencers, friends; often the general mind share of a company is enough. He has no fucking idea what a memory bus is. But he knows that NVidia = PC gaming. And, as such, even the most crippled cards, from our perspective, will absolutely sell.
#57
Vayra86
hsewIs anybody actually buying those high margin 7900 cards? AMD’s current strategy seems to be competing with Intel. That’s not a good sign.
I did, if that counts for anything haha. No regrets really, its working fine. And that's all it does, too.

But again, strategy is more than just the tiny slice of PC dGPU for consumers. GPU is bigger than that. For AMD, its also APU, its console business, its laptop products, etc.
And I do think Intel is their more direct competitor, especially now that they're also doing GPU; AMD has to stay ahead of them much more so than Nvidia, who's limited in the x86 space.
#58
Assimilator
Vayra86Oh that was just my free hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it. And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years, (rebrands, revivals, etc.) they never consistently clawed back share.
Yes, but there is a time at which accountants and investors start to get twitchy and ask "is this particular line of business still worthwhile to be investing in, maybe we could and should be redirecting these resources to more profitable segments of the business?" And human nature being what it is, that time is likely going to align with a drop from double- to single-digit marketshare. According to the trend we're observing, that drop is likely going to happen within the next year. I can very conceivably see AMD decide to discontinue desktop GPUs if the question comes up at that point, and focus the resources currently being spent there on consoles and CPUs.

I don't want to see AMD exit the desktop GPU market, regardless of how many people call me a fanboy how many times, because I'm scared of what that would mean for consumers. I'm just presenting data and saying, this looks really bad, and AMD needs to change something to make it not bad - and endless forum posts accusing NVIDIA of being greedy, or anticompetitive, or whatever are not it. Nor is buying AMD products out of a misplaced sense of brand loyalty.
#59
Vayra86
AssimilatorYes, but there is a time at which accountants and investors start to get twitchy and ask "is this particular line of business still worthwhile to be investing in, maybe we could and should be redirecting these resources to more profitable segments of the business?" And human nature being what it is, that time is likely going to align with a drop from double- to single-digit marketshare. According to the trend we're observing, that drop is likely going to happen within the next year. I can very conceivably see AMD decide to discontinue desktop GPUs if the question comes up at that point, and focus the resources currently being spent there on consoles and CPUs.

I don't want to see AMD exit the desktop GPU market, regardless of how many people call me a fanboy how many times, because I'm scared of what that would mean for consumers. I'm just presenting data and saying, this looks really bad, and AMD needs to change something to make it not bad - and endless forum posts accusing NVIDIA of being greedy, or anticompetitive, or whatever are not it. Nor is buying AMD products out of a misplaced sense of brand loyalty.
Absolutely agreed but endless forum posts about AMD needing to do better haven't worked for them either :)
It is what it is, sometimes things need to turn to absolute shit before they get better.
#60
Chrispy_
ChaitanyaIt will be a case of "the more you buy, the more you save," with GB203 priced to make GB202 more attractive.
Leatherjacket isn't going to live that quote down, is he? :)

It's like RT "just works" (except when it doesn't)
Vayra86But again, strategy is more than just the tiny slice of PC dGPU for consumers. GPU is bigger than that. For AMD, its also APU, its console business, its laptop products, etc.
And I do think Intel is their more direct competitor, especially now that they're also doing GPU; AMD has to stay ahead of them much more so than Nvidia, who's limited in the x86 space.
Yeah, Nvidia's bread and butter is currently datacentre compute.

AMD have made fantastic inroads to the servers market with EPYC but their enterprise/datacentre GPU solutions lack the CUDA support they need to gain any significant traction. I think Nvidia knew this and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
#61
redeye
Faster than a 4080? A 5090 needs to be "price increase percentage" faster than a 4090. A 7900 XTX is faster (TPU reviews, CP2077) than a 4080…
#62
Tigerfox
Vayra86But another trend along the whole graph is that their peaks consistently lowered over time, and they bottom out in an ever shorter cadence, until they arrive at consistent below 18% share. They had some not-too-shitty years, but nowhere is there consistent growth YoY.
You are a genius. The same is true for NV, since it's a duopoly and NV's graph exactly mirrors AMD's. Only while AMD's never goes above 40%, NV's never goes below 60%.
Vayra86For AMD, its also APU, its console business, its laptop products, etc.
That's hopefully one big reason to continue GPU development.
AssimilatorI can very conceivably see AMD decide to discontinue desktop GPUs if the question comes up at that point, and focus the resources currently being spent there on consoles and CPUs.
That's what I fear might happen, too, but see above.
But perhaps we are interpreting AMD's chances too much from the high-end point of view. After all, their marketshare was quite strong in the times of Polaris & Vega vs Pascal.
#63
Vayra86
Chrispy_Leatherjacket isn't going to live that quote down, is he? :)

It's like RT "just works" (except when it doesn't)


Yeah, Nvidia's bread and butter is currently datacentre compute.

AMD have made fantastic inroads to the servers market with EPYC but their enterprise/datacentre GPU solutions lack the CUDA support they need to gain any significant traction. I think Nvidia knew this and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
I think people forget its a big, big pie here, and AMD's gotten itself quite a few slices of it at this point.

We're actually talking about them competing with Intel. And let's be real here: their CPU product is now better, their GPU product is lightyears ahead of them too. Hasn't Nvidia ALWAYS been something they couldn't quite catch? I'm not seeing a difference here in the overall perspective. Every time, even on the rare occasions when Nvidia did objectively worse, Nvidia won.
TigerfoxYou are a genius
Thanks! Luckily I can't detect the sarcasm here :)
#64
Onasi
Vayra86Absolutely agreed but endless forum posts about AMD needing to do better haven't worked for them either :)
It is what it is, sometimes things need to turn to absolute shit before they get better.
I still remember people unironically saying that AMD should sell the Radeon division to Samsung when that particular rumor was floating around, back when the two companies announced that AMD's graphics IP would be coming to Exynos SoCs.
Chrispy_I think Nvidia knew this and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
Yes, it’s called being a forward looking tech company with a solid business plan that is focused on a holistic product - hardware, software, all the surrounding ecosystem.

Alternatively, it can be called an EVUL move from an EVUL anti-competitive company that is led by a Dark Lord in a Leather Jacket of EVULNESS +7 who is bent on not allowing gamers to play the latest AAA slop at highest detail out of sheer spite for poor AMD and their paladins of virtue representing them on the forums.

You know, either/or.
#65
hsew
Vayra86I think people forget its a big, big pie here, and AMD's gotten itself quite a few slices of it at this point.

We're actually talking about them competing with Intel. And let's be real here: their CPU product is now better, their GPU product is lightyears ahead of them too.
I don’t think dGPU marketshare is a priority for AMD at this point. And you can’t really say they’re lightyears ahead of Intel when Intel already beats them in RT. I mean what else can AMD really lean on in terms of software over Intel? At the rate Intel’s GPUs are improving AMD won’t be ahead for long, assuming Intel is actually serious about this market of course.
#66
Neo_Morpheus
The Ngreedia fanbois are really insane.

They make it sound like every AMD GPU is at best half as fast as its Ngreedia counterpart.

Yes, bring the RT nonsense and as stated, only influencers aka reviewers care about that.

Only 2 games (so far) are a decent showcase of RT (Cyberpunk and Control), but it doesn't add anything to gameplay, nor does it justify the insane performance hit.
#67
Vayra86
hsewI don’t think dGPU marketshare is a priority for AMD at this point. And you can’t really say they’re lightyears ahead of Intel when Intel already beats them in RT. I mean what else can AMD really lean on in terms of software over Intel? At the rate Intel’s GPUs are improving AMD won’t be ahead for long, assuming Intel is actually serious about this market of course.
Where is Intel faster in RT? Relatively perhaps... but they have yet to make a truly fast GPU. Scaling up is exactly their biggest challenge. That's how their initial product failed so hard and long: they couldn't scale properly, basing themselves on IGPU technology, and then even when they rewrote the blueprint, all we got was A770.

Absolute, raw performance is the only real indicator. Because even RT performance will be based on that; you can accelerate all you want, but you can't accelerate past your raster/raw perf capability.

Also, alongside scaling their GPU up, there is the matter of die space/cost. How big is their die? How big is AMD's? And that's where chiplets come in. If AMD can improve that further, they have a technological advantage here, and the first release isn't horrible with RDNA3. Its not perfect. But it still moved the ball forward.
#68
Daven
ChomiqYou'd be surprised how often I have heard "Aaaaand AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
And if your buddy was the only Radeon customer in existence, I would declare AMD drivers 100% defunct. But since AMD has sold millions and millions of Radeons since their inception, I'm not going to worry about your buddy's problems too much.
#69
Tigerfox
Vayra86Where is Intel faster in RT? Relatively perhaps...
Relatively, they're even faster than Nvidia (just looked at one test, in rasterizer, 4060 is 11% faster than 770 in 1440p, with RT it's only 1%). But yeah, scaling is their problem and the reason I wasn't invested in their first GPU generation at all.
Scaling was AMDs problem more than once, too.
#70
Prince Valiant
TheDeeGeeSucks to be you, but Path Tracing is the future of videogame lighting, even AMD will have to optimize for it.
When should we expect "the future" to arrive? It's been over half a decade since the RTX line launched and RT performance still stinks.
#71
sethmatrix7
Prince ValiantWhen should we expect "the future" to arrive? It's been over half a decade since the RTX line launched and RT performance still stinks.
Not only is performance lackluster, but meaningful RT implementation is limited to a handful of singleplayer titles.

Looks like Star Wars Outlaws will have some sort of RT implementation, and I look forward to seeing how that is.
#72
Vayra86
TigerfoxRelatively, they're even faster than Nvidia (just looked at one test, in rasterizer, 4060 is 11% faster than 770 in 1440p, with RT it's only 1%). But yeah, scaling is their problem and the reason I wasn't invested in their first GPU generation at all.
Scaling was AMDs problem more than once, too.
I am still convinced that besides this, RT is an overblown thing and the end result of that technology will become known within engine technologies, not as poster child 'muh RTX is ON' bullshit. We're still early adopting this tech, and it could go any number of ways, the only real consistent way I see right now is stuff like Nanite, or the implementation that CryEngine shows us in Neon Noir. Visually remarkably close to 'real RT (ahem... with denoising and numerous other tweaks, so define real...)', but without a hard performance hit, I can even run that on Pascal.

Especially if you want cross-platform compatibility, which is rapidly becoming a must especially for anything that also releases outside the PC camp (but even within, think handhelds!), please do explain to me how you're going to guzzle extra power to enable RT proper if you haven't even got that to show the best of raster graphics.

It ain't happening. Literally every market movement except the one Nvidia tries to convince us of, is moving in the opposite direction. RT will only work if you get your game from the cloud. So, how does that mix exactly with Nvidia selling you 1500 dollar GPUs I wonder? Where is this long term RT perspective on dGPU?
#73
Dristun
Neo_MorpheusThe Ngreedia fanbois are really insane.

They make it sound like every AMD GPU is at best half as fast as its Ngreedia counterpart.

Yes, bring the RT nonsense and as stated, only influencers aka reviewers care about that.

Only 2 games (so far) are a decent showcase of RT (Cyberpunk and Control), but it doesn't add anything to gameplay, nor does it justify the insane performance hit.
Clearly the only DIY enthusiasts who are able to think for themselves are the ones still buying AMD cards, everyone else has been programmed. Are nVidia's influencers using MKUltra tech in their videos? Or is that LSD in the water supply that makes games with proper RT implementation look so good?
#74
theouto
TheDeeGeeSucks to be you, but Path Tracing is the future of videogame lighting, even AMD will have to optimize for it.
And they are, but the future is not the present. Tell me a GPU that can properly run path tracing in a modern game without any trickery, and tell me how many of those modern games have path tracing to begin with.
Besides, not only are we a ways off, I'd argue that games still look plenty good without path tracing. I can wait.
#75
Markosz
AssimilatorI really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
99% of those 88% of NVIDIA owners don't have x080 or x090 cards. Most people have lower-tier cards, 50, 60, maybe a few 70s, where AMD is actually competent and offers better value for the price. Most people just blindly buy into NVIDIA without any research, because that's what they hear everywhere. Like people still believing Intel CPUs are the only thing to buy even today, because they live in a cave or something...

If you check modern games, they almost all use over 12 GB of VRAM at 4K. There is nothing to argue over here: Nvidia is limiting the VRAM to upsell people to a higher-tier, more expensive card and to limit how future-proof those cards are, just like when they release a new technology and limit it to only their newest series.