Thursday, December 26th 2024

NVIDIA GeForce RTX 5090 Powered by "GB202" Silicon, 512-bit GDDR7, ASIC Pictured

Here is the first picture of what is very likely the GeForce RTX 5090 "Blackwell," the successor to the RTX 4090 "Ada." The picture, with its massive GPU and board layout, appears to confirm the weekend's bare PCB leak. The RTX 5090 is powered by the "GB202" silicon, the largest gaming GPU based on the "Blackwell" graphics architecture. The chip in the picture bears the ASIC code "GB202-300-A1." From this code, we can deduce that the RTX 5090 may not max out the silicon (i.e., enable all the SMs present on it), as maxed-out NVIDIA ASICs tend to carry the variant designation "450."

The "GB202" ASIC is surrounded by sixteen GDDR7 memory chips, which reportedly make the 32 GB memory size of the RTX 5090. The chip count, coupled with the large GPU package size (high pin-count), confirm that the "GB202" features a 512-bit wide memory bus. Assuming a memory speed of 28 Gbps, this memory bus should yield a stellar memory bandwidth of 1,792 GB/s. The GPU and memory are surrounded by the card's 24-phase VRM solution. This draws power from a single 16-pin 12V-2x6 power connector. NVIDIA will likely max out the 600 W continuous power-delivery capability of the connector, and give the card a TGP of around 500-550 W, if not more.
Source: harukaze5719 (Twitter)

43 Comments on NVIDIA GeForce RTX 5090 Powered by "GB202" Silicon, 512-bit GDDR7, ASIC Pictured

#26
freeagent
I haven't done my income tax in about 4-5 years; my return is supposed to be in the thousands, very plural. It should almost be enough for this :D
#27
AusWolf
wolf: The Titan(s), for whatever cumulative reasons, but surely price for one, didn't sell well enough, apparently.

Reintroduce that tier of card as an XX90 and start the price lower, then gradually ratchet it up.
That's why I advise everybody not to pay any attention to product names whatsoever. The only things of importance are specs, performance and price.
#28
MxPhenom 216
ASIC Engineer
SOAREVERSOR: It's always been GeForce in name only for the Titan and X090 series. It's a prosumer card for data stuff at home, hence the specs, especially the VRAM. $2,000, even $4,000, is still cheap to that crowd. The heat doesn't matter, nor does the power consumption, as these are going to see the same treatment as the 4090 and prior cards: 4-8 of them slapped into a box with waterblocks (Camino had those out before we knew the card's specs and was taking orders), dual PSUs, Threadripper/Xeon platforms with massive internal radiators, or, what's becoming the norm, ports out to external radiators with the option for a rackmount of the same product.

All the way back to the 8800 GTX, NVIDIA was stating that CUDA was its future. When the first Titan hit, the selling point was its use in professional settings; it wasn't until companies like Falcon NW decided it was SLI in a card slammed into an ITX case, and people like Linus started buying them up, that its image as a "gaming" card took hold.

GeForce doesn't even really mean gaming. Tons of companies deploy GeForce-based laptops that will never game, but use the GPUs for other professional things that don't need the Quadro drivers or the price associated with them.
This.

I build workstations for a multi-million-dollar CAD company, and the software they use recommends GeForce, so that's what we put in the systems: 3090s, 4090s, and the next few will be 5090s.
#29
marios15
agent_x007: Yup, and get the NV price bonus of between 2x and 3x vs. the GDDR7 version.
I mean, of course :D
But you can take the 150 W and put it into the core, or keep the power draw reasonable.
#31
harm9963
For 4090 owners like me, it's just a matter of dropping a 5090 into a system that's already set up for it, so it comes down to which model to get, FE or PNY. Reviews are what I wait for!
#32
Visible Noise
harm9963: For 4090 owners like me, it's just a matter of dropping a 5090 into a system that's already set up for it, so it comes down to which model to get, FE or PNY. Reviews are what I wait for!
Although I'll lust after a 5090, it will be a very long time before I can justify replacing my 4090. I'm already CPU-limited in most situations.
#33
harm9963
Visible Noise: Although I'll lust after a 5090, it will be a very long time before I can justify replacing my 4090. I'm already CPU-limited in most situations.
Fill out your System Specs.
#35
Wirko
Dirt Chip: 512-bit will be the challenge,
said the credit card.
#36
harm9963
Not FE? More like a reference card.
#37
Sir_Coleslaw
The most interesting question for me at the moment is which manufacturers will offer a 5090 with a pre-installed water block, and when. I'm already reckoning with a price of €2,500+ anyway. But the sale of my current 4090 will pay for a large part of that.
#38
AusWolf
Sir_Coleslaw: But the sale of my current 4090 will pay for a large part of that.
Unless Nvidia pushes for another price hike, I would count on used 4090s flooding the market and their prices plummeting as soon as the 5090 is out.
#39
Vya Domus
mtosev: Anyone got an idea how much a 5090 will cost?
"If you have to ask it's not for you" dollars.
#40
tpa-pr
nguyen: Oh well, my 4090 churns through games @ 4K barely using 250-300 W (stock FE clocks & undervolted). Not sure if the 5090 will even break a sweat running today's games @ a locked 4K 144 Hz; I would definitely need a 4K 240 Hz monitor to stress the 5090.
Is that 4K native or with DLSS? I'm curious.
#41
nguyen
tpa-pr: Is that 4K native or with DLSS? I'm curious.
That is with DLSS; at 4K native, the GPU will use about 30 W more.
#42
Dawora
matar: 600 Watts, if not more? WOW. Looks like those with an 850 W or 1000 W PSU should make sure to add a 1200 W+ PSU alongside their 5090 order.
Do people really not know the difference between TGP and TDP?
Hyderz: It's very neatly packed... but wouldn't it be better if the components were more spread out? It looks like the thing will run super hot...
Maybe NVIDIA's engineers know better than us.
Prima.Vera: 550 W TDP :laugh: :laugh: :laugh: :kookoo: :kookoo: :kookoo:
Or 400 W.
But it will have better perf/W than AMD.
#43
WonkoTheSaneUK
Vya Domus"If you have to ask it's not for you" dollars.
Probably even more, if Nvidia are paying extra to speed up production lines, so they can fill warehouses before the new administration's anti-China tariffs kick in on Jan 20th.