
NVIDIA GB300 "Blackwell Ultra" Will Feature 288 GB HBM3E Memory, 1400 W TDP

AleksandarK

News Editor
Staff member
NVIDIA's "Blackwell" series is barely out, with B100, B200, and GB200 chips shipping to OEMs and hyperscalers, but the company is already firming up its upgraded "Blackwell Ultra" plans with its upcoming GB300 AI server. According to UDN, the next-generation NVIDIA system will be powered by the B300 GPU, operating at 1400 W and delivering a remarkable 1.5x improvement in FP4 performance per card over its B200 predecessor. One of the most notable upgrades is the memory configuration, with each GPU now sporting 288 GB of HBM3E memory, a substantial increase from the GB200's 192 GB. The new design uses a 12-layer stack architecture, advancing from the GB200's 8-layer configuration. The system's cooling infrastructure has been completely reimagined, incorporating advanced water cooling plates and enhanced quick disconnects in the liquid cooling system.
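The quoted capacities are consistent with the same per-layer DRAM die, just stacked higher. Assuming eight HBM3E stacks per GPU (a common configuration for this class of part, not stated in the article), a quick sanity check:

```python
# Sanity check on the quoted HBM capacities. The 8-stacks-per-GPU figure
# is an assumption, not something the article confirms.
STACKS_PER_GPU = 8

def gb_per_layer(total_gb, layers, stacks=STACKS_PER_GPU):
    """Capacity of a single DRAM layer in one stack, in GB."""
    return total_gb / stacks / layers

b200_layer = gb_per_layer(192, 8)   # GB200: 192 GB over 8-layer stacks
b300_layer = gb_per_layer(288, 12)  # GB300: 288 GB over 12-layer stacks

print(b200_layer, b300_layer)  # 3.0 3.0 -> same 3 GB die, taller stack
```

Under that assumption, both generations work out to 3 GB (24 Gb) per layer, so the capacity jump comes entirely from the move to 12-high stacks.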

Networking capabilities have also seen a substantial upgrade, with ConnectX-8 network cards replacing the previous ConnectX-7 generation, while optical modules have been upgraded from 800G to 1.6T, ensuring faster data transmission. Regarding power management and reliability, the GB300 NVL72 cabinet will standardize on capacitor trays, with an optional Battery Backup Unit (BBU) system. Each BBU module costs approximately $300 to manufacture, with a complete GB300 system's BBU configuration totaling around $1,500. The system's supercapacitor requirements are equally substantial: each NVL72 rack requires over 300 units, priced between $20 and $25 per unit in production due to their high power handling. The GB300, carrying a Grace CPU and a Blackwell Ultra GPU, also introduces LPCAMM on its compute boards, indicating that the LPCAMM memory standard is about to take over servers, not just laptops and desktops. We will have to wait for the official launch to see the LPCAMM memory configurations.
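Taken at face value, the quoted component costs pin down a few per-rack figures, a rough sketch using only the numbers above:

```python
# Rough per-rack cost arithmetic from the figures quoted in the article.
BBU_UNIT_COST = 300          # USD per BBU module (quoted)
BBU_SYSTEM_COST = 1500       # USD per complete GB300 system (quoted)
SUPERCAP_COUNT = 300         # minimum units per NVL72 rack (quoted)
SUPERCAP_COST = (20, 25)     # USD per unit at production pricing (quoted)

bbu_modules = BBU_SYSTEM_COST // BBU_UNIT_COST
supercap_low = SUPERCAP_COUNT * SUPERCAP_COST[0]
supercap_high = SUPERCAP_COUNT * SUPERCAP_COST[1]

print(bbu_modules)                  # 5 BBU modules per system
print(supercap_low, supercap_high)  # 6000 7500 -> $6,000-7,500 in supercaps
```

That implies around five BBU modules per system and $6,000-7,500 of supercapacitors per rack, small line items next to the GPUs themselves.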



View at TechPowerUp Main Site | Source
 
HBM3E
288GB
etc..
and people are still discussing whether 16 GB is enough, or whether GDDR7Xplusultra will bring bandwidth to the consumer market if a 384-bit bus is used.

All I see is that they've translated the industrial cost per computed unit to the consumer market.
 
HBM3E
288GB
etc..
and people are still discussing whether 16 GB is enough, or whether GDDR7Xplusultra will bring bandwidth to the consumer market if a 384-bit bus is used.

All I see is that they've translated the industrial cost per computed unit to the consumer market.
RAM comparisons between compute GPUs and gaming GPUs are like comparing an orange to a Klingon. They have nothing to do with each other.
 
HBM3E
288GB
etc..
and people are still discussing whether 16 GB is enough, or whether GDDR7Xplusultra will bring bandwidth to the consumer market if a 384-bit bus is used.

All I see is that they've translated the industrial cost per computed unit to the consumer market.
It's almost as if Nvidia is doing this intentionally to prevent consumer sales from eating into their b2b products. Crazy.
 
RAM comparisons between compute GPUs and gaming GPUs are like comparing an orange to a Klingon. They have nothing to do with each other.
I disagree. HBM memory has already been found on gaming GPUs, and we will probably see it again in the future once prices come down.

The main reason Nvidia isn't putting more RAM on their gaming GPUs is that they don't want to affect their margins too much. RAM scaling has slowed down drastically, and we no longer see the big density increases we saw in the past. Stacking is probably the only option there, but it's still costly. Until prices come down, we will stagnate.

This, and also, Nvidia's whole marketing strategy is to make you feel sorry for not buying a *090-class GPU.

They can put that much memory on these chips because they sell them at crazy prices. Their margins are still increasingly good. But gamers now have to adapt to the new reality that both Radeon and GeForce's primary clients are no longer consumers, but businesses with deep pockets.

Gamers will now just get leftovers. Get used to it.
 

Ruru

S.T.A.R.S.
I disagree. HBM memory has already been found on gaming GPUs, and we will probably see it again in the future once prices come down.

The main reason Nvidia isn't putting more RAM on their gaming GPUs is that they don't want to affect their margins too much. RAM scaling has slowed down drastically, and we no longer see the big density increases we saw in the past. Stacking is probably the only option there, but it's still costly. Until prices come down, we will stagnate.

This, and also, Nvidia's whole marketing strategy is to make you feel sorry for not buying a *090-class GPU.

They can put that much memory on these chips because they sell them at crazy prices. Their margins are still increasingly good. But gamers now have to adapt to the new reality that both Radeon and GeForce's primary clients are no longer consumers, but businesses with deep pockets.

Gamers will now just get leftovers. Get used to it.
Nah, dunno... Fury and Vega were pretty meh, even with HBM, compared to the Ngreedia cards in the same price range.

Though not gonna lie, they were way more interesting than the 900/1000-series cards, even though they weren't as fast.
 
Fury and Vega used HBM way too early, a bit like RDNA 3 was too early on chiplets.

There seems to be a tendency on AMD's side to early-adopt newer manufacturing tech in the hope of gaining an advantage. It does not seem to have worked for them yet, and I suspect it is related to the purge AMD made in their GPU division. (This, and also the fact that they will go with just one uArch in the future.)

But denser stacked memory will come, and sooner rather than later the memory will be on-package. GDDR memory already needs to be really close to the GPU; at some point you can't have huge speeds over long distances.

In the next decade, we will certainly see GPU packages with multiple dies and stacked memory. The huge initial cost is currently being paid by the DC/cloud providers and other AI shops.

It's sad that there is stagnation right now, but this high-end tech will one day be within reach of gaming folks.
 
I disagree. HBM memory has already been found on gaming GPUs, and we will probably see it again in the future once prices come down.

The main reason Nvidia isn't putting more RAM on their gaming GPUs is that they don't want to affect their margins too much. RAM scaling has slowed down drastically, and we no longer see the big density increases we saw in the past. Stacking is probably the only option there, but it's still costly. Until prices come down, we will stagnate.

This, and also, Nvidia's whole marketing strategy is to make you feel sorry for not buying a *090-class GPU.

They can put that much memory on these chips because they sell them at crazy prices. Their margins are still increasingly good. But gamers now have to adapt to the new reality that both Radeon and GeForce's primary clients are no longer consumers, but businesses with deep pockets.

Gamers will now just get leftovers. Get used to it.
I think the main reason Nvidia isn't adding much VRAM isn't to screw with gamers; they want to screw with professionals who don't care that much about compute performance but need the VRAM. As a consequence, yes, gamers are also screwed. Nvidia is trying to solve the issue with some "AI textures" crap so they can keep selling low-VRAM cards and pros move to the higher-end GPUs.
 
LOL, talking about the $25 cost per supercapacitor and the $1,500 per-rack cost of batteries seems like ignoring the elephant(s) in the room:
  • A rack of GB200s costs about 3 million bucks, if you are placing an order big enough to negotiate a good price (i.e., you are Apple).
  • One single GB200 blade? $60,000 for just the GPU and no supporting hardware.
  • A rack of GB200s uses about 120 kW!
I am fairly certain GB200 won't suddenly stop being relevant, so expect GB300 to cost more by at least the proportional difference in performance, and more realistically a huge premium on top of that.
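That "proportional difference" argument can be sketched in one line, using the ~$3M GB200 rack figure above and the article's 1.5x FP4 uplift (both rough, quoted numbers, purely illustrative):

```python
# Back-of-envelope floor for GB300 rack pricing: scale the ~$3M GB200
# rack figure by the quoted 1.5x per-card FP4 uplift. Illustrative only.
GB200_RACK_USD = 3_000_000   # negotiated large-order price (quoted above)
FP4_UPLIFT = 1.5             # B300 vs B200 FP4 improvement (quoted)

floor_estimate = GB200_RACK_USD * FP4_UPLIFT
print(floor_estimate)  # 4500000.0 -> ~$4.5M floor, before any premium
```

So ~$4.5M per rack is the optimistic lower bound of this reasoning; any "Ultra" premium lands on top.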
 