Monday, January 6th 2025
First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote
NVIDIA's unannounced GeForce RTX 5090 graphics card has leaked, confirming key specifications of the next-generation GPU. Thanks to exclusive information from VideoCardz, we can see the packaging of Inno3D's RTX 5090 iChill X3 model, which confirms that the graphics card will feature 32 GB of GDDR7 memory. The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system, suggesting significant cooling requirements for the flagship card. According to earlier leaks, the RTX 5090 will be based on the GB202 GPU and include 21,760 CUDA cores. The card's memory system is a significant upgrade, with its 32 GB of GDDR7 memory running on a 512-bit memory bus at 28 Gbps, capable of delivering nearly 1.8 TB/s of bandwidth. That is twice the memory capacity of the upcoming RTX 5080, which is expected to ship with 16 GB of capacity but faster 30 Gbps GDDR7 modules.
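As a quick sanity check, the quoted bandwidth follows directly from the bus width and per-pin data rate in the leak. Here is a minimal sketch of the arithmetic, using only the figures reported above:

```python
# Back-of-the-envelope check of the leaked bandwidth figure.
bus_width_bits = 512        # RTX 5090 memory bus width (per the leak)
data_rate_gbps = 28         # GDDR7 per-pin data rate in Gbps (per the leak)

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps  # GB/s
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s (~{bandwidth_gb_s / 1000:.2f} TB/s)")
# Prints: Peak memory bandwidth: 1792 GB/s (~1.79 TB/s), in line with the "nearly 1.8 TB/s" claim
```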
Power consumption has increased significantly, with the RTX 5090's TDP rated at 575 W and TGP at 600 W, marking a 125-watt increase over the previous RTX 4090 in raw TDP. NVIDIA is scheduled to hold its CES keynote today at 6:30 PM PT, where the company is expected to officially announce several new graphics cards. The lineup should include the RTX 5090, RTX 5080, RTX 5070 Ti, RTX 5070, and an RTX 5090D model specifically for the Chinese market. Early indications are that the RTX 5080 will be the first card to reach consumers, with a planned release date of January 21st. Release dates for other models, including the flagship RTX 5090, have not yet been confirmed. The RTX 5090 is currently the only card in the RTX 50 series planned to use the GB202 GPU. Pricing information and additional specifications are expected to be revealed during the upcoming announcement.
Source: VideoCardz
82 Comments on First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote
This is the Leather Jacket way.
The 5090 has 33% more CUDA cores than the 4090 and just under double the bandwidth.
So unless games are bandwidth starved, we shouldn’t expect much over 20% performance increase. I don’t expect an increase in core clocks given the 33% rise in cores matches the 30% rise in power on the same process node.
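For context, here is a minimal sketch of the spec ratios that comment is reasoning from, assuming the leaked RTX 5090 numbers above and the RTX 4090's published figures (16,384 CUDA cores, 450 W TDP, roughly 1,008 GB/s of bandwidth):

```python
# Rough generational ratios; RTX 5090 figures are from the leak, RTX 4090 figures are published specs.
cores_5090, cores_4090 = 21_760, 16_384
tdp_5090, tdp_4090 = 575, 450           # watts
bw_5090, bw_4090 = 1_792, 1_008         # GB/s

print(f"CUDA cores: +{(cores_5090 / cores_4090 - 1) * 100:.0f}%")  # ~ +33%
print(f"TDP:        +{(tdp_5090 / tdp_4090 - 1) * 100:.0f}%")      # ~ +28%
print(f"Bandwidth:  +{(bw_5090 / bw_4090 - 1) * 100:.0f}%")        # ~ +78%
```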
Who cares about "standard" ATX cases anyway? All the new builds I see on social media are using modern vertical GPU mounts or the square split design. Traditional ATX cases are very rare. Even my 2009-era HAF X has plenty of side room for these modern monsters. You mean the same "power use" regulations that have made modern appliances 5-year throwaway garbage? THAT's your go-to for why regulation is good?
Also, it's "efficiency", not "power use". Setting hard power limits would make large refrigerators, freezers, or large washing machines impossible to build. So if the EU decided that you don't need more than 75 W for gaming, since it is an entirely wasteful hobby, how hard would you have to force yourself to smile while buying a $1000 RX 6400? Just curious. And they do. You can buy anything from an RTX 4060 to an RTX 4090.
No, the 5090 is not a "professional" card. It doesn't have fully enabled FP output, nor does it have ECC memory or pro driver support. It DOES have 32 GB of memory because it's got a 512-bit bus, and the only other option would be 16 GB, which would have sent the internet into a hysterical frenzy. As a side benefit, you can tinker with LLMs and AI image generation on this. Yay!
If you can't afford or don't want a 5090, that's fine: go buy a 5070 and enjoy your game library. Or an RX 9070. Or an Intel B580. I swear, if people whined like this a decade ago, we never would have gotten anything larger than a GTX 550 Ti.
So unless someone needs to do some heavy compute work right now and has the budget for it, I'm sorry, this is just a pointless waste.
EU regulations are made after long and comprehensive research by very smart people, taking all aspects into account. They will never decide to limit a video card to 75 W. That is plain nonsense.
But they might well block a video card from drawing more than 1 kW from your power source, and not because of ecology-related things, but mostly because of possible fire hazards, fuses blowing in sensitive buildings, and so on. This is not the Wild West, or Rednek Ville, where everybody can run a 1 MW power plant just because.
If common sense is not in the vocabulary of a corporation, then maybe it is the duty of a state committee or similar body to impose it.
Let's see... what does $650 get you now for a current GPU? Well, after looking at my local Micro Center, nothing. It will buy you nothing for GPUs. With the new gen coming out, restocking of the current gen has stopped from the looks of it. You either step up to the $850+ range for a 4070 Ti Super or down to the $400 range for a 7700 XT.
I guess we'll see if Intel and AMD can really shake up the low- to mid-tier GPU pricing this gen. It would be nice to see things more affordable for the middle and low end while still offering a decent performance range. As it currently stands, I'm priced out of the top-end GPUs by a lot.
True, but that's more of a "game development is fucked" thing than an "insufficient hardware" thing. I wouldn't be surprised if, a year after the 5090 launches, there are games with combinations of settings that would bring THAT one down to 30 frames too.
I remember 10-15 years ago, when with an x80 card I could play a 2-3 year old game at maximum settings with 8x SSAA (best AA method ever produced).
Now?.... :)))
You say it's nonsense because YOU want a GPU with more than 75 W of capability. Funny how that works. Once you start down the road of "you don't need this", it WILL be taken to its logical extreme. Remember the UK's potato peeler license debacle?
I'd love to hear what this sensible argument for a NEED for GPUs over 75 watts to play games is supposed to be. No consumer video card draws more than 1 kW of power. Not even close. The highest was the 600 W the 3090 Ti could tickle. If you're talking whole systems, buddy, SLI gaming PCs were pushing 2 kW back in 2008. Somehow you all survived.
A 1 kW card poses no more of a risk than a 500 W card does, or a 300 W card. If you have such an old building, you should be upgrading the wiring instead of buying GPUs to waste time gaming on.
Here in "rednek ville" we have modern 25 amp wiring with 20 amp breakers that can run a 5090 without burning the house down. Interesting that this is such a concern in the EU, with all those "very smart people" banning everything. Why are you even allowed to have a building with wiring that cant handle a 1000w load? That's literal knob and tube style wiring.
If you have such widespread problems with wiring, the solution is NOT to ban GPUs, because then you'll have to ban fridges, microwaves, and vacuum cleaners too. What you SHOULD be doing is banning that old wiring. This is why the EU gets referred to as a "nanny state".

The 980 Ti came out in 2015. Inflation is a thing. Also, to hit on @Hxx's point, I'm making about 60% more per year than I was in 2015. $850 today is $650 from 2015. I really wouldn't want to take a pay cut to 2015 levels just to get $650 GPUs back.
Also obligatory: the 8800 Ultra was $850. In 2007. That's over $1200 today. Expensive top GPUs are not a new phenomenon.
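A minimal sketch of the inflation adjustments those price comparisons rest on; the cumulative multipliers to today's dollars are rough assumptions, not official CPI figures:

```python
# Illustrative inflation adjustments for the price comparisons above.
# Multipliers to 2025 dollars are approximations, not official CPI data.
to_2025 = {2007: 1.50, 2015: 1.32}

print(f"$850 (8800 Ultra, 2007) ~ ${850 * to_2025[2007]:.0f} today")   # ~ $1275
print(f"$650 (2015 flagship)    ~ ${650 * to_2025[2015]:.0f} today")   # ~ $858
```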