Monday, January 6th 2025

First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

NVIDIA's unannounced GeForce RTX 5090 graphics card has leaked, confirming key specifications of the next-generation GPU. Thanks to exclusive information from VideoCardz, we can see the packaging of Inno3D's RTX 5090 iChill X3 model, which confirms that the graphics card will feature 32 GB of GDDR7 memory. The leaked materials show that Inno3D's variant uses a 3.5-slot cooling system, suggesting significant cooling requirements for the flagship card. According to earlier leaks, the RTX 5090 is based on the GB202 GPU and includes 21,760 CUDA cores. The memory system is a significant upgrade: 32 GB of GDDR7 running on a 512-bit bus at 28 Gbps, capable of delivering nearly 1.8 TB/s of bandwidth. That is twice the memory capacity of the upcoming RTX 5080, which is expected to ship with 16 GB of faster 30 Gbps GDDR7 modules.
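For readers who want to sanity-check the ~1.8 TB/s figure, here is a minimal sketch of the arithmetic in Python; the 256-bit bus assumed for the RTX 5080 follows from its rumored 16 GB configuration and is not confirmed by the leak.

```python
# Back-of-the-envelope check of the quoted memory bandwidth figures.
# The 512-bit bus and 28 Gbps rate come from the leak; the 256-bit bus
# assumed for the RTX 5080 is a rumor, not confirmed in the article.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(512, 28))  # 1792.0 GB/s, i.e. ~1.8 TB/s for the RTX 5090
print(peak_bandwidth_gbs(256, 30))  # 960.0 GB/s for a hypothetical 256-bit RTX 5080
```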

Power consumption has increased significantly: the RTX 5090's TDP is rated at 575 W with a TGP of 600 W, a 125 W increase in TDP over the previous RTX 4090. NVIDIA is scheduled to hold its CES keynote today at 6:30 PM PT, where the company is expected to officially announce several new graphics cards. The lineup should include the RTX 5090, RTX 5080, RTX 5070 Ti, RTX 5070, and an RTX 5090D model specifically for the Chinese market. Early indications are that the RTX 5080 will be the first card to reach consumers, with a planned release date of January 21st. Release dates for other models, including the flagship RTX 5090, have not yet been confirmed. The RTX 5090 is currently the only card in the RTX 50 series planned to use the GB202 GPU. Pricing information and additional specifications are expected to be revealed during the upcoming announcement.
Source: VideoCardz

82 Comments on First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

#76
kapone32
Hakker: It actually has better green credentials. Solar parks reflect more light than they absorb, which also warms up the planet, and production of the solar cells isn't green in that regard either. The same goes for the number of birds that die from wind farms. Nuclear is in fact greener than solar. Hydro is the cleanest of them all, but pretty limited in usage and has its own environmental impacts.
Then you still have the problem of scaling your energy output up or down, which is far more difficult with both options than with a nuclear plant. If you want the world to really go green, then nuclear is an unavoidable power source at this time unless fusion power makes a breakthrough.

The problem with nuclear plants is that all people see is Chernobyl: a badly designed plant from the start, and they forget that it happened because of bad management and operator error. At the same time it was somewhat a blessing in disguise, because the world learned an enormous amount about what not to do and the implications when it goes horribly wrong.
Three Mile Island, acid rain. Do you have any idea how many nuclear power plants ring the Great Lakes? The biggest problem with nuclear is the waste. Depleted-uranium-covered bullets and tanks are not the solution. For those that don't know, the rods are made of uranium. Chernobyl introduced new cancers after it happened. Fukushima happened on the ocean, and the ocean is pure chemistry in my eyes. When that oil platform the Americans built in the Gulf of Mexico blew up, one of the biggest surprises was that the ocean has an algae that eats oil.
Posted on Reply
#77
Prima.Vera
TheinsanegamerN: Why is that nonsense? Gaming is a wasteful hobby. Why do you need 150 watts to play bing bing wahoo? Hell, why do you need 75 watts? The Switch only pulls like 10 at load; that's all you need for video games.

You say it's nonsense because YOU want a GPU with more than 75 W of capability. Funny how that works. Once you start down the road of "you don't need this", it WILL be taken to its logical extreme. Remember the UK's potato peeler license debacle?

I'd love to hear what the sensible argument is for NEEDING GPUs over 75 watts to play games.

No video card on the consumer side draws more than 1 kW of power. Not even close. The highest was the 600 W the 3090 Ti could tickle. If you're talking whole systems, buddy, SLI gaming PCs were pushing 2 kW back in 2008. Somehow you all survived.

A 1 kW card poses no more of a risk than a 500 W card does, or a 300 W card. If you have such an old building, you should be upgrading the wiring instead of buying GPUs to waste time gaming on.

Here in "rednek ville" we have modern 25-amp wiring with 20-amp breakers that can run a 5090 without burning the house down. Interesting that this is such a concern in the EU, with all those "very smart people" banning everything. Why are you even allowed to have a building with wiring that can't handle a 1000 W load? That's literal knob-and-tube-style wiring.

If you have such widespread problems with wiring, the solution is NOT to ban GPUs, because then you're gonna have to ban fridges, microwaves, and vacuum cleaners too. What you SHOULD be doing is banning that old wiring. This is why the EU gets referred to as a "nanny state".
Sorry, but amazingly, you misunderstood EVERYTHING I just wrote. You're acting like a typical spoiled American brat who thinks only he is right, so I'm not going to entertain your agenda any further.
For the rest of the readers: I'm just saying that for anything with such high power draw there should be some minimum regulations in place, otherwise those burning connectors won't be the last thing that starts to burn with these microwave-oven-like monsters gradually creeping toward 1 kW of power draw.
Just imagine, in not even 10 years, having a 1 kW GPU, a 500 W CPU, 100 W for two SSDs, etc. :)))
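As a rough sanity check of the circuit-capacity numbers argued over above, here is a minimal sketch assuming a North American 120 V / 20 A circuit, the common 80% continuous-load rule of thumb, and placeholder CPU and system figures (only the 600 W GPU number comes from the article):

```python
# Rough headroom check for a high-wattage GPU on a household circuit.
# Assumes 120 V / 20 A and an 80% continuous-load rule of thumb; the CPU
# and "rest of system" numbers are placeholders, not real specifications.
VOLTAGE_V = 120
BREAKER_A = 20
CONTINUOUS_FACTOR = 0.8

usable_w = VOLTAGE_V * BREAKER_A * CONTINUOUS_FACTOR        # 1920 W
system_w = 600 + 300 + 150  # GPU TGP + assumed CPU + assumed rest of system
print(f"usable: {usable_w:.0f} W, estimated draw: {system_w} W, "
      f"headroom: {usable_w - system_w:.0f} W")
```

A 230 V EU circuit with the same breaker rating would have roughly twice that headroom, which is why the disagreement above is more about regulation than about physics.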
Posted on Reply
#78
mechtech
"The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system"

not 4?!?!
not sure I believe it ;)
Posted on Reply
#79
onemanhitsquad
I am going to enjoy every watt of the Founders Edition.
Posted on Reply
#80
Dimitriman
Daven: How do you get a 50% increase in performance from the 5090 over the 4090 with only a 33% increase in CUDA cores, when the 4090 only beat the 4080 by 25% with a 67% increase in cores?
You are right, which is exactly why he increased the price by 33%.
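For reference, a quick sketch of the core-count ratios behind that question; the 5090 figure is from the leak, while the 4090 and 4080 counts are their public specifications:

```python
# Core-count ratios referenced in the question above.
# RTX 5090 count is from the leak; 4090/4080 counts are public specs.
cuda_cores = {"RTX 5090": 21760, "RTX 4090": 16384, "RTX 4080": 9728}

uplift_5090 = cuda_cores["RTX 5090"] / cuda_cores["RTX 4090"] - 1  # ~0.33
uplift_4090 = cuda_cores["RTX 4090"] / cuda_cores["RTX 4080"] - 1  # ~0.68
print(f"5090 vs 4090: +{uplift_5090:.0%} cores, 4090 vs 4080: +{uplift_4090:.0%} cores")
```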
Posted on Reply
#81
Daven
TheinsanegamerN: Are we basing this bet on NVIDIA's numbers, or on TPU's? If TPU's, when does the card actually release?
Given that the NVIDIA keynote showed only RT/AI/DLSS scores, we will have to wait for the TPU review, unless you really believe what NVIDIA says and the 5090 is double the performance of the 4090, in which case you win and everyone else loses.
Posted on Reply
#82
sephiroth117
TheinsanegamerN: No, the 5090 is not a "professional" card. It doesn't have fully enabled FP output, nor does it have ECC memory or pro driver support. It DOES have 32 GB of memory because it's got a 512-bit bus and the only other option would be 16 GB, which would have sent the internet into a hysterical frenzy. As a side benefit, you can tinker with LLMs and AI image generation on it. Yay!

If you can't afford or don't want a 5090, that's fine; go buy a 5070 and enjoy your game library. Or an RX 9070. Or an Intel B580. I swear, if people had whined like this a decade ago we never would have gotten anything larger than a GTX 550 Ti.
  • No, the 4090 already DOES have ECC, just not enabled in the control panel by default... no doubt the 5090 will too :). It most definitely can be a professional card: what you can do with a 5090, professionally speaking, will be pretty advanced, just like it is today with the 4090. Or will you tell me the ECC on the 4090 was for gaming too?
  • The GDDR7 speed is very relevant and important here, but in my view you have it backwards: it's 512-bit because it has 32 GB of memory, just like the 24 GB (12 x 2 GB / 32-bit) of the 4090 makes it 384-bit (see the sketch at the end of this post). We are really not bandwidth-saturated on a 4090 at 4K; unless MFG is ultra intensive on bandwidth, I fail to see why we would be. But the main point you kind of missed is that we either get 16 GB on a 256-bit bus with the 5080 (with a slightly faster memory clock to compensate, apparently) or 32 GB/512-bit with the 5090. It's a real shame there's no in-between, like a 24 GB/384-bit 5080 Ti or something. It's the lack of choice: 16 GB is not enough for me and not future-proof for 4K/High, yet 32 GB is too much for gaming, and I suspect it's there for AI and the stock will be depleted because of it.
I don't buy AMD except for CPUs. I plan on getting a 5090, pending independent reviews... because I don't have a choice VRAM-wise, which is exactly my point.
However, "a decade ago" we also weren't dealing with tons of scalpers just to get a damn NVIDIA high-end card. People whined at least as much in the past; I was quite young, but there was the ATi vs. NVIDIA rivalry, the GTX 970 and its 3.5 GB of full-speed VRAM, etc., and yet we still ended up with something slightly larger than a GTX 550 Ti :)...
Posted on Reply