Saturday, March 9th 2024

NVIDIA RTX 50-series "GB20X" GPU Memory Interface Details Leak Out

Earlier in the week it was revealed that NVIDIA had distributed next-gen AI GPUs to its most important ecosystem partners and customers—Dell's CEO expressed enthusiasm with his discussion of "Blackwell" B100 and B200 evaluation samples. Team Green's next-gen family of gaming GPUs has received less media attention in early 2024—a mid-February TPU report pointed to a rumored PCIe 6.0 CEM specification for upcoming RTX 50-series cards, but leaks have become uncommon since late last year. Top technology tipster kopite7kimi has broken the relative silence on Blackwell's gaming configurations—an early-hours tweet posits a slightly underwhelming scenario: "although I still have fantasies about 512 bit, the memory interface configuration of GB20x is not much different from that of AD10x."

Past disclosures have hinted at next-gen NVIDIA gaming GPUs sporting memory interface configurations comparable to the current crop of "Ada Lovelace" models. The latest batch of insider information suggests that Team Green's next flagship GeForce RTX GPU—GB202—will stick with a 384-bit memory bus. The beefiest current-gen GPU, AD102—as featured in GeForce RTX 4090 graphics cards—is specced with a 384-bit interface. A significant upgrade for GeForce RTX 50xx cards could arrive with a step up to next-gen GDDR7 memory—kopite7kimi reckons that top GPU designers will stick with 16 Gbit memory chip densities (2 GB). JEDEC officially announced its "GDDR7 Graphics Memory Standard" a couple of days ago. VideoCardz has kindly assembled the latest batch of insider info into a cross-generation comparison table (see below).
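For readers who want to sanity-check what these figures imply, here is a rough back-of-the-envelope sketch. Only the RTX 4090's 384-bit, 21 Gbps GDDR6X configuration is an established figure; the GDDR7 per-pin rates used below are assumptions for illustration (first-generation parts are broadly expected in the 28-32 Gbps range), not confirmed specifications.

```python
# Rough memory-subsystem math for the rumored configurations.
# GDDR7 data rates below are assumptions for illustration, not confirmed specs.

def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int, clamshell: bool = False) -> int:
    """Capacity = number of 32-bit devices x per-device density (doubled in clamshell mode)."""
    devices = bus_width_bits // 32            # each GDDR device exposes a 32-bit interface
    if clamshell:
        devices *= 2                          # two devices share each 32-bit channel
    return devices * chip_density_gbit // 8   # Gbit -> GB

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width x per-pin data rate / 8."""
    return bus_width_bits * data_rate_gbps / 8

# AD102 as configured on the RTX 4090: 384-bit GDDR6X @ 21 Gbps, 16 Gbit (2 GB) chips
print(vram_capacity_gb(384, 16), "GB @", peak_bandwidth_gbs(384, 21), "GB/s")       # 24 GB @ 1008.0 GB/s

# Rumored GB202: same 384-bit bus and 16 Gbit chips, GDDR7 at an assumed 28-32 Gbps
for rate in (28, 32):
    print(vram_capacity_gb(384, 16), "GB @", peak_bandwidth_gbs(384, rate), "GB/s") # 1344.0 / 1536.0 GB/s
```

Even with an unchanged 384-bit bus and 2 GB chips, then, the move to GDDR7 alone could lift peak bandwidth by roughly a third to a half over the RTX 4090, while capacity stays at 24 GB.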
Sources: kopite7kimi Tweet, VideoCardz, Tom's Hardware, Wccftech, Tweak Town

41 Comments on NVIDIA RTX 50-series "GB20X" GPU Memory Interface Details Leak Out

#1
Dragam1337
If this is true, then it is rather disappointing. It will be the 3rd flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
Posted on Reply
#2
Onasi
Dragam1337: If this is true, then it is rather disappointing. It will be the 3rd flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
Posted on Reply
#3
sephiroth117
Dragam1337: If this is true, then it is rather disappointing. It will be the 3rd flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
I speak for myself, but I really don't give a toss about 8K gaming. I'm planning on purchasing a 4K PC monitor this year, and my PlayStation 5 can barely output 30 FPS at native 4K lol

I don't see the point of engineering, and making your customers pay for, >24 GB of VRAM for a use case that would satisfy the 0.001% (0.000001%?) that actually own an 8K display :).

Once GPUs can comfortably master 4K, then we can evolve to 8K. We are really not there yet, at all.
Posted on Reply
#4
GodisanAtheist
Not surprising.

I'd even bet on L2 cache sizes shrinking or at least staying the same.

A new GDDR generation means all the workarounds for bandwidth-to-shader ratios can be alleviated for a short while.

Don't need bigger buses or caches. Clamshell or 3 GB chips if you need more capacity.
Posted on Reply
#5
Dragam1337
Onasi: Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
sephiroth117: I speak for myself, but I really don't give a toss about 8K gaming. I'm planning on purchasing a 4K PC monitor this year, and my PlayStation 5 can barely output 30 FPS at native 4K lol

I don't see the point of engineering, and making your customers pay for, >24 GB of VRAM for a use case that would satisfy the 0.001% (0.000001%?) that actually own an 8K display :).

Once GPUs can comfortably master 4K, then we can evolve to 8K. We are really not there yet, at all.
It gets so tiring with people spewing nonsense about stuff they obviously have no clue about, or experience with...

[8K gameplay screenshot]

As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...
Witcher 3 remaster with ultra ray tracing is an example of a game where it runs fine at 8K - right until it runs out of VRAM, and FPS absolutely tanks due to VRAM swapping.
Posted on Reply
#6
Onasi
Dragam1337: As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...
*sigh* So what level of DLSS do you have engaged on this “8K” screenshot? Looks like Quality, if I am not mistaken? With a 4090 to boot. At 60 FPS. Barely acceptable. Because when I talk about 8K I am strictly talking in terms of native res. Obviously, upsampling changes the game.
Oh, and Quality DLSS is 5K by the way. Exactly what I was talking about in terms of a next render target. We are more in agreement than not, it seems.
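For reference, DLSS Quality mode renders at roughly two-thirds of the output resolution per axis (the commonly cited scale factor, used here as an assumption), so the internal render target behind an 8K output works out to about 5K:

```python
# Internal render resolution behind an "8K" DLSS Quality output.
# The 2/3 per-axis scale is the commonly cited Quality-mode factor (assumption).
out_w, out_h = 7680, 4320
scale = 2 / 3
print(f"{round(out_w * scale)}x{round(out_h * scale)}")  # 5120x2880, i.e. roughly 5K
```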
Posted on Reply
#7
Dragam1337
Onasi: *sigh* So what level of DLSS do you have engaged on this "8K" screenshot? Looks like Quality, if I am not mistaken? With a 4090 to boot. At 60 FPS. Barely acceptable. Because when I talk about 8K I am strictly talking in terms of native res. Obviously, upsampling changes the game.
Oh, and Quality DLSS is 5K by the way. Exactly what I was talking about in terms of a next render target. We are more in agreement than not, it seems.
Ofc I'm using DLSS - it would be idiotic not to, as at 8K there basically ain't a difference in quality. And as essentially all games support it these days, yes, 8K is very much doable.

As for 60 FPS barely being acceptable - well, you just stay at 1080p then, together with the other plebs. Meanwhile I will enjoy glorious 8K graphics ;)
Posted on Reply
#8
Onasi
@Dragam1337
Ah, so you aren’t actually interested in having a discussion on GPU specs, frametime costs and how resolution scaling affects those going forward. You are just shitposting. Fair enough, carry on.
Posted on Reply
#9
Dragam1337
Onasi: @Dragam1337
Ah, so you aren't actually interested in having a discussion on GPU specs, frametime costs and how resolution scaling affects those going forward. You are just shitposting. Fair enough, carry on.
Evidently you are confusing yourself with me.
Posted on Reply
#10
N/A
They are exactly the same. "Not much different" means the same. What could be different? Don't tell me they're considering 352-bit variants again.
Posted on Reply
#11
trsttte
The 384-bit bus on GB202 is fine, and even GB203 with 256-bit is OK, not great, not terrible. What's impressive and really concerning is to see GB205 with 192-bit, so we'll have the same clown show of '70 cards having only 12 GB of VRAM? Outstanding :nutkick:

Or maybe they use GB203 for the '70 class, '80 gets GB202, and they retire the '90 again? Or maybe 24 Gbit GDDR7 chips will change the math? Oh well, there are still more than 6 months to go, so there's not really much point in speculating; whatever it ends up as, it will certainly be the worst possible option, as usual.
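To put numbers on the "change the math" point, a quick sketch of what the rumored bus widths would yield with 16 Gbit (2 GB) versus 24 Gbit (3 GB) GDDR7 devices; which density actually lands on which SKU is pure speculation:

```python
# VRAM capacity per rumored bus width, for 2 GB vs 3 GB GDDR7 devices.
# Which density ships on which SKU is speculation.
for bus_bits in (384, 256, 192):
    devices = bus_bits // 32  # one 32-bit GDDR device per memory channel
    for chip_gb in (2, 3):
        print(f"{bus_bits}-bit x {chip_gb} GB chips -> {devices * chip_gb} GB")
# 384-bit: 24 or 36 GB; 256-bit: 16 or 24 GB; 192-bit: 12 or 18 GB
```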
Posted on Reply
#12
evernessince
Onasi: Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking about an xx90-class card. Extra VRAM enables future games to push what they can do; it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that: the card is still relevant today thanks to its 11GB of VRAM.

Mind you, IMO the biggest problem with Nvidia sticking with 24GB for its flagship card would be that it curtails its usefulness for AI. I can already reach 32GB VRAM usage at 1024x1024 on SDXL, never mind newer AI models currently in testing that will certainly be out by the time this GPU drops. Nvidia's next-gen cards can be amazing performance-wise, but if AMD releases its next-gen flagship with 36 or 48GB, for example, that's going to attract a lot of high-end customers over to them.
Posted on Reply
#13
Onasi
evernessince: I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking about an xx90-class card. Extra VRAM enables future games to push what they can do; it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that: the card is still relevant today thanks to its 11GB of VRAM.
The games will use what is available. Of course, in cases of extremely shit optimization it's possible to gobble up essentially endless VRAM. And just as possible is for developers to implement absurd graphics features or nonsense like, I dunno, 8K texture packs for their Ultra Nightmare Eldritch Horror settings preset that would bring even a hypothetical 5090 Ti Super Ultra to its knees. But the truth is, the vast majority of the market won't run top-tier 2000-bucks cards. Consoles also don't have 24 gigs of memory and are unlikely to with the refreshes. As such, no developer who would actually like their games to sell would push for insane VRAM usage (not just allocation for cache) targets. I think most people on this enthusiast-oriented site forget (perhaps understandably) that Ultra settings at 4K with Path Tracing is more of a tech demo for the GPU market to show off and for the big-money customers to feel good about their purchase, rather than the way the developers actually intend the majority of people to experience the game.
Posted on Reply
#14
Hugis
OK, you two, keep it on topic and stop bickering!
If you need to sling crap around, do it by PM.
Posted on Reply
#15
evernessince
Onasi: The games will use what is available. Of course, in cases of extremely shit optimization it's possible to gobble up essentially endless VRAM. And just as possible is for developers to implement absurd graphics features or nonsense like, I dunno, 8K texture packs for their Ultra Nightmare Eldritch Horror settings preset that would bring even a hypothetical 5090 Ti Super Ultra to its knees. But the truth is, the vast majority of the market won't run top-tier 2000-bucks cards. Consoles also don't have 24 gigs of memory and are unlikely to with the refreshes. As such, no developer who would actually like their games to sell would push for insane VRAM usage (not just allocation for cache) targets. I think most people on this enthusiast-oriented site forget (perhaps understandably) that Ultra settings at 4K with Path Tracing is more of a tech demo for the GPU market to show off and for the big-money customers to feel good about their purchase, rather than the way the developers actually intend the majority of people to experience the game.
Consoles are different: they have a dedicated decompression chip that allows them to stream assets from disk with low latency, and they benefit from closer-to-metal optimizations. A game like the new Ratchet & Clank has to take up more VRAM and memory on PC because it cannot assume the storage subsystem can stream assets in a timely enough manner, whereas on console it's guaranteed.

And again, for AI, 24GB is simply not enough for a flagship card that should be able to run next-gen models. I have a 4090 in my Stable Diffusion rig and I will not upgrade it to a 5090 if they aren't increasing the VRAM size. If AMD comes out with a card with more VRAM I'd likely upgrade to that instead, particularly as ROCm has been making strides performance-wise. I can't say I see the logic in Nvidia remaining stagnant for 3 generations in a row. That's silly given the continual push for AI.
Posted on Reply
#16
gffermari
Unless they use 256-bit and 4 GB chips... so we get 32 GB of VRAM. I don't know how the bandwidth is affected, though, or the price. Are the 4 GB chips that much more expensive than 2x2 GB?

nVidia will definitely never give us 384-bit with 4 GB chips, aka 48 GB, in a consumer gaming GPU anytime soon.
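On the bandwidth question: peak bandwidth is set by bus width and per-pin data rate, not by how the capacity is reached, so a 256-bit card gets to 32 GB either way. A small sketch, with 32 Gbps used as an assumed GDDR7 per-pin rate:

```python
# Two ways to reach 32 GB on a 256-bit bus; peak bandwidth is the same either way.
# The 32 Gbps per-pin rate is an assumed GDDR7 figure, not a confirmed spec.
bus_bits, rate_gbps = 256, 32
channels = bus_bits // 32                       # eight 32-bit channels
dense_gb = channels * 4                         # 8 x 32 Gbit (4 GB) chips  -> 32 GB
clamshell_gb = channels * 2 * 2                 # 16 x 16 Gbit (2 GB) chips -> 32 GB
bandwidth_gbs = bus_bits * rate_gbps / 8        # GB/s, independent of chip density
print(dense_gb, clamshell_gb, bandwidth_gbs)    # 32 32 1024.0
```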
Posted on Reply
#17
close
evernessince: I'm not so sure games in the next 3 years won't use 24GB at lower resolutions than 8K. It's more than a possibility. Flagship graphics cards are not supposed to be at their limit in consumer applications right out of the gate, especially when you are talking about an xx90-class card. Extra VRAM enables future games to push what they can do; it doesn't have to have an immediate and obvious impact. The 1080 Ti is a great example of that: the card is still relevant today thanks to its 11GB of VRAM.

Mind you, IMO the biggest problem with Nvidia sticking with 24GB for its flagship card would be that it curtails its usefulness for AI. I can already reach 32GB VRAM usage at 1024x1024 on SDXL, never mind newer AI models currently in testing that will certainly be out by the time this GPU drops. Nvidia's next-gen cards can be amazing performance-wise, but if AMD releases its next-gen flagship with 36 or 48GB, for example, that's going to attract a lot of high-end customers over to them.
You're totally right, Nvidia is rent-seeking and sticking to their VRAM limits like Intel was sticking to quad-core even on their high end, because AMD was not able to put any real pressure on them, much like in the GPU market today. Worse, Nvidia's moat is even bigger than Intel's used to be. Nvidia also doesn't want consumer GPUs to ever be too good at AI when they sell a card with a 10% higher BOM at a 10x higher price.

But the "but 8K, the humanity" argument is one of those useless e-peen arguments. 4K gaming has been around for some time and is still not prevalent in the PC space. 8K won't be a real concern for enough people to matter for many years. The February 2024 Steam survey says ~60% have a 1080p monitor resolution and another ~20% have 1440p. 4K is in the low single digits. And 8K is that one guy above. On DLSS. At 60 FPS. If 8K gaming is your only or even just primary concern, then the 24GB of VRAM you buy today is going to be a true limitation around the time you need to upgrade anyway.
Posted on Reply
#18
MentalAcetylide
Dragam1337: It gets so tiring with people spewing nonsense about stuff they obviously have no clue about, or experience with...
Posted on Reply
#19
Metroid
Onasi: Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
While I like this push to 8K, I agree with you: it will take about 3 generations, or 10 years, to get to 120 FPS at 8K. I do believe the 5090 will achieve 8K 60 FPS and 4K 200 FPS in some modern games.
Posted on Reply
#20
starfals
I was one of the first people to buy a 4K display, like ages ago. People back then said the same things about 4K that we (even myself) say about 8K. Who knows, the jump might be too great this time. Then again, 4K was a lot more demanding than 1080p and 1440p. Especially 1080p; back then every monitor and TV was mostly focusing on that. TVs are a good example of this huge jump. They went from 1080p to 4K directly. 4K was slow to happen too, then every single TV and every second gaming monitor had that resolution. Consoles had it too. It's like a 4K switch was flipped, and the planet was suddenly all ready. I kept reading FOR YEARS how there is NO 4K content: movies, TV channels and games supporting it. YEARS. Look where we are now. It's EVERYWHERE! So yeah, who knows? 8K might happen, it is the next step, and we always keep going up when it comes to resolutions. I have yet to see a higher new resolution that was out there and not adopted eventually. The biggest issue atm is lack of content and displays at good prices. I sure ain't gonna buy a first-gen 8K TV. These are terrible atm, and expensive. Bad combo. The first 4K display I got was actually very good. I still use it today. I doubt I will use (for a long time) the first 8K laggy, bad contrast, bad reflections, bad CPU TV. They say it's a chicken-and-egg kind of issue. Well, we kind of need both at the same time. Why buy a TV/monitor if you can't use it for gaming or movies? So yeah, we need better video cards too. Nvidia are again disappointing us on that front. They might be right, cus it doesn't seem to be the time for that yet... but someone has to push it. TV makers ain't, movies ain't... so perhaps gaming?
Posted on Reply
#21
Minus Infinity
Onasi: Ah yes, gaming at 8K, a totally reasonable and sane workload. Not 5K which is actually the most likely high end standard to emerge after 4K is mainstream in… a decade maybe? Like, come on.
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in a habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
But but, DLSS 2 + DLSS 3. I love me those fake frames in the morning.

But we are jumping the gun. Where is my 27" 8K OLED 480 Hz monitor?
Posted on Reply
#22
Macro Device
What matters more is whether bang per buck and bang per watt will improve (<25% doesn't count, ffs). I don't even have hope at this point. Why bother about VRAM if the compute power of a hypothetical 5060 Ti (at 500-ish USD) is roughly on par with the 550-ish USD 4070 non-Super? And this, considering AMD are unwilling to compete, is the most likely scenario.

P.S. 8K will be a thing in the 2030s. We aren't even 50% through the 4K gaming era.
Posted on Reply
#23
Redwoodz
Rumours based on multiple misquotes. The Dell exec never said he had them in hand; he said he was excited about what it will bring for AI. Whatever it will be, it will be designed for AI first.
Posted on Reply
#24
watzupken
starfals: I was one of the first people to buy a 4K display, like ages ago. People back then said the same things about 4K that we (even myself) say about 8K. Who knows, the jump might be too great this time. Then again, 4K was a lot more demanding than 1080p and 1440p. Especially 1080p; back then every monitor and TV was mostly focusing on that. TVs are a good example of this huge jump. They went from 1080p to 4K directly. 4K was slow to happen too, then every single TV and every second gaming monitor had that resolution. Consoles had it too. It's like a 4K switch was flipped, and the planet was suddenly all ready. I kept reading FOR YEARS how there is NO 4K content: movies, TV channels and games supporting it. YEARS. Look where we are now. It's EVERYWHERE! So yeah, who knows? 8K might happen, it is the next step, and we always keep going up when it comes to resolutions. I have yet to see a higher new resolution that was out there and not adopted eventually. The biggest issue atm is lack of content and displays at good prices. I sure ain't gonna buy a first-gen 8K TV. These are terrible atm, and expensive. Bad combo. The first 4K display I got was actually very good. I still use it today. I doubt I will use (for a long time) the first 8K laggy, bad contrast, bad reflections, bad CPU TV. They say it's a chicken-and-egg kind of issue. Well, we kind of need both at the same time. Why buy a TV/monitor if you can't use it for gaming or movies? So yeah, we need better video cards too. Nvidia are again disappointing us on that front. They might be right, cus it doesn't seem to be the time for that yet... but someone has to push it. TV makers ain't, movies ain't... so perhaps gaming?
4K is still niche when you consider the fact that most gamers are still on 1080p or 1440p. And if you have observed, while the likes of Ampere and Ada launched as a good step towards 4K gaming, that did not last long. From 2023 on, even the flagship could not play most new games at native resolution. So if maintaining smooth framerates at 4K is already a struggle, you can imagine the challenge at 8K. Upscaled 4K is basically running at 1080p or 1440p.
Going back to the topic, it is not unexpected that Nvidia will not really change their product specs. If the high-margin data center or AI cards are not getting top-end specs, like higher VRAM, you can imagine their care for gamers is even less. Probably right at the bottom of their priority list.
Posted on Reply
#25
ARF
Dragam1337: If this is true, then it is rather disappointing. It will be the 3rd flagship card in a row with 24 GB of VRAM, which is already a limiting factor in some games at 8K.
The thing is that DisplayPort 1.4, which is what nvidia offers, is not ready for 8K. Even DP 2.1 is not ready.
If you are going to stay with an nvidia card, you must wait for DisplayPort 2.2 or 3.0.
Onasi: And are AMD GPUs doing significantly better at 8K with their wider buses?
AMD will not offer a big Navi 4 next generation. The RX 7900 XTX will be their halo card till Navi 50 sometime in 2026-2027.
Dragam1337: It gets so tiring with people spewing nonsense about stuff they obviously have no clue about, or experience with...
As seen above I AM playing at 8K, and it works great in a lot of titles, but some are limited by VRAM...
Witcher 3 remaster with ultra ray tracing is an example of a game where it runs fine at 8K - right until it runs out of VRAM, and FPS absolutely tanks due to VRAM swapping.
close: The February 2024 Steam survey says ~60% have a 1080p monitor resolution and another ~20% have 1440p. 4K is in the low single digits. And 8K is that one guy above.
This only goes to prove that people are stuck in 2010 and do not want to move forward. Which is a shame - they prefer the awful screen-door effect that those low-quality low-res monitors tend to offer.
Posted on Reply