Saturday, March 9th 2024
NVIDIA RTX 50-series "GB20X" GPU Memory Interface Details Leak Out
Earlier in the week it was revealed that NVIDIA had distributed next-gen AI GPUs to its most important ecosystem partners and customers—Dell's CEO expressed enthusiasm when discussing "Blackwell" B100 and B200 evaluation samples. Team Green's next-gen family of gaming GPUs has received less media attention in early 2024—a mid-February TPU report pointed to a rumored PCIe 6.0 CEM specification for upcoming RTX 50-series cards, but leaks have become uncommon since late last year. Top technology tipster kopite7kimi has broken the relative silence on Blackwell's gaming configurations—an early-hours tweet posits a slightly underwhelming scenario: "although I still have fantasies about 512 bit, the memory interface configuration of GB20x is not much different from that of AD10x."
Past disclosures have hinted at next-gen NVIDIA gaming GPUs sporting memory interface configurations comparable to the current crop of "Ada Lovelace" models. The latest batch of insider information suggests that Team Green's next flagship GeForce RTX GPU—GB202—will stick with a 384-bit memory bus. The beefiest current-gen GPU, AD102—as featured in GeForce RTX 4090 graphics cards—is specced with a 384-bit interface. A significant upgrade for GeForce RTX 50xx cards could arrive with a step up to next-gen GDDR7 memory—kopite7kimi reckons that top GPU designers will stick with 16 Gbit memory chip densities (2 GB). JEDEC officially announced its "GDDR7 Graphics Memory Standard" a couple of days ago. VideoCardz has kindly assembled the latest batch of insider info into a cross-generation comparison table (see below).
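As a rough illustration of why the bus width and chip density rumors matter, the arithmetic behind VRAM capacity and peak bandwidth can be sketched as below. The GDDR6X figures match the shipping RTX 4090; the 28 Gbps GDDR7 per-pin rate is an assumption for illustration, not a confirmed RTX 50-series spec.

```python
# Illustrative memory-subsystem arithmetic. The GDDR7 per-pin rate used
# here (28 Gbps) is an assumption, not a confirmed GB202 specification.

def vram_capacity_gb(bus_width_bits: int, chip_density_gbit: int) -> int:
    """Each GDDR chip sits on a 32-bit channel; capacity = chips * chip size."""
    chips = bus_width_bits // 32
    return chips * chip_density_gbit // 8  # Gbit -> GB

def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# RTX 4090 (AD102): 384-bit bus, twelve 16 Gbit GDDR6X chips at 21 Gbps
print(vram_capacity_gb(384, 16))      # 24 GB
print(peak_bandwidth_gbs(384, 21.0))  # 1008.0 GB/s

# Rumored GB202: same 384-bit bus and 16 Gbit chips, but GDDR7
print(vram_capacity_gb(384, 16))      # still 24 GB
print(peak_bandwidth_gbs(384, 28.0))  # 1344.0 GB/s at an assumed 28 Gbps
```

This is why an unchanged 384-bit interface with unchanged 2 GB chips still yields a meaningful upgrade: capacity stays at 24 GB, but the GDDR7 transition alone lifts peak bandwidth by roughly a third.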
Sources:
kopite7kimi Tweet, VideoCardz, Tom's Hardware, Wccftech, Tweak Town
41 Comments on NVIDIA RTX 50-series "GB20X" GPU Memory Interface Details Leak Out
And are AMD GPUs doing significantly better at 8K with their wider buses? I have no idea, I am not in the habit of checking 12 FPS benchmarks for completely unrealistic scenarios.
I don't see the point of engineering, and making your customers pay for, >24 GB of RAM for a use case that would satisfy the 0.001% (0.000001%?) who actually own an 8K display :).
Once GPUs can comfortably master 4K, then we can evolve to 8K; we are really not there yet, at all.
I'd even bet on L2 cache sizes shrinking or at least staying the same.
New GDDR generation means all the workarounds for bandwidth to shader ratios can be alleviated for a short while.
Don't need bigger buses or caches. Clamshell or 3 GB chips if you need more capacity.
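For what it's worth, the two capacity options mentioned above work out as follows on a fixed 384-bit bus (a quick sketch; the chip densities and clamshell use are the commenter's speculation, not confirmed specs):

```python
# Capacity options on a fixed 384-bit bus (12 x 32-bit channels).
# The densities and clamshell mode here are speculative, for illustration.

def capacity_gb(bus_bits: int, chip_gbit: int, clamshell: bool = False) -> int:
    chips = bus_bits // 32          # one chip per 32-bit channel
    if clamshell:
        chips *= 2                  # two chips share each channel, 16 bits each
    return chips * chip_gbit // 8   # Gbit -> GB

print(capacity_gb(384, 16))                  # 24 GB: today's 16 Gbit (2 GB) chips
print(capacity_gb(384, 16, clamshell=True))  # 48 GB: clamshell doubles capacity
print(capacity_gb(384, 24))                  # 36 GB: 24 Gbit (3 GB) chips
```

Either route raises capacity without widening the bus, which is why neither requires a larger memory controller on the GPU die.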
Oh, and Quality DLSS is 5K by the way. Exactly what I was talking about in terms of a next render target. We are more in agreement than not, it seems.
As for 60 FPS barely being acceptable - well, you just stay at 1080p then together with the other plebs. Meanwhile I will enjoy glorious 8K graphics ;)
Ah, so you aren’t actually interested in having a discussion on GPU specs, frametime costs and how resolution scaling affects those going forward. You are just shitposting. Fair enough, carry on.
Or maybe they use GB203 for the '70 class, the '80 gets GB202, and they retire the '90 again? Or maybe 24 Gb GDDR7 chips will change the math? Oh well, still more than six months to go, so not really much point in speculating; whatever it ends up at, it will certainly be the worst possible option, as usual.
Mind you, IMO the biggest problem with Nvidia sticking with 24 GB for its flagship card would be that it curtails its usefulness for AI. I can already reach 32 GB of VRAM usage at 1024x1024 on SDXL, never mind newer AI models currently in testing that will certainly be out by the time this GPU drops. Nvidia's next-gen cards can be amazing performance-wise, but if AMD releases its next-gen flagship with 36 or 48 GB, for example, that's going to attract a lot of high-end customers over to them.
If you need to sling crap around do it by PM.
And again, for AI, 24 GB is simply not enough for a flagship card that should be able to run next-gen models. I have a 4090 in my Stable Diffusion rig, and I will not upgrade it to a 5090 if they aren't increasing the VRAM size. If AMD comes out with a card with more VRAM, I'd likely upgrade to that instead, particularly as ROCm has been making strides performance-wise. I can't say I see the logic in Nvidia remaining stagnant for three generations in a row. That's silly given the continual push for AI.
Nvidia will definitely never give us 384-bit with 4 GB chips (i.e. 48 GB) in a consumer gaming GPU anytime soon.
But the "but 8K, the humanity" argument is one of those useless e-peen arguments. 4K gaming has been around for some time and is still not prevalent in the PC space. 8K won't be a real concern for enough people to matter in the next many years. The February 2024 Steam survey says ~60% have a 1080p monitor resolution and another ~20% have 1440p. 4K is in the low single digits. And 8K is that one guy above. On DLSS. At 60 FPS. If 8K gaming is your only, or even just primary, concern, then the 24 GB of VRAM you buy today is going to be a true limitation around the time you need to upgrade anyway.
But we are jumping the gun. Where is my 27" 8K OLED 480 Hz monitor?
P.S. 8K will be a thing in the 2030s. We aren't even 50% through the 4K gaming era.
Going back to the topic, it is not unexpected that Nvidia will not really change their product specs. If the high-margin data center or AI graphics cards are not getting top-end specs, like higher VRAM, you can imagine their care for gamers is even less. Probably right at the bottom of their priority list.
If you are going to stay with an Nvidia card, you must wait for DisplayPort 2.2 or 3.0. AMD will not offer a big Navi 4 next generation. The RX 7900 XTX will be their halo card until Navi 50, sometime in 2026-2027. This only goes to prove that people are stuck in 2010 and do not want to move forward. Which is a shame - they prefer the awful screen-door effect that those low-quality, low-res monitors tend to offer.