Monday, January 6th 2025

First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

NVIDIA's unannounced GeForce RTX 5090 graphics card has leaked, confirming key specifications of the next-generation GPU. Thanks to exclusive information from VideoCardz, we can see the packaging of Inno3D's RTX 5090 iChill X3 model, which confirms that the graphics card will feature 32 GB of GDDR7 memory. The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system, suggesting significant cooling requirements for the flagship card. According to earlier leaks, the RTX 5090 will be based on the GB202 GPU and include 21,760 CUDA cores. The card's memory system is a significant upgrade: its 32 GB of GDDR7 memory runs on a 512-bit memory bus at 28 Gbps, capable of delivering nearly 1.8 TB/s of bandwidth. That is twice the memory capacity of the upcoming RTX 5080, which is expected to ship with 16 GB of faster 30 Gbps GDDR7 modules.
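The quoted bandwidth figure follows directly from the bus width and the per-pin data rate. As a quick sanity check (a minimal sketch; the helper function below is ours for illustration, not an official figure):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_s(512, 28))  # RTX 5090 (leaked): 1792.0 GB/s, i.e. ~1.8 TB/s
print(peak_bandwidth_gb_s(384, 21))  # RTX 4090, for comparison: 1008.0 GB/s
```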

Power consumption has increased significantly, with the RTX 5090's TDP rated at 575 W and TGP at 600 W, a 125-watt increase in TDP over the previous RTX 4090. NVIDIA is scheduled to hold its CES keynote today at 6:30 PM Pacific Time, where the company is expected to officially announce several new graphics cards. The lineup should include the RTX 5090, RTX 5080, RTX 5070 Ti, RTX 5070, and an RTX 5090D model specifically for the Chinese market. Early indications are that the RTX 5080 will be the first card to reach consumers, with a planned release date of January 21st. Release dates for other models, including the flagship RTX 5090, have not yet been confirmed. The RTX 5090 is currently the only card in the RTX 50 series planned to use the GB202 GPU. Pricing information and additional specifications are expected to be revealed during the upcoming announcement.
Source: VideoCardz

82 Comments on First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

#26
Dimitriman
I bet one coffee that MSRP is $2,399 for the 5090 and $1,299 for the 5080: exactly in line with a 50% faster 5090 over the 4090 and a 30% faster 5080 over the 4080S, and therefore bringing ZERO gains in perf/$.

This is the Leather Jacket way.
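For the curious, the arithmetic behind that bet works out as follows (a rough sketch; the 4090's $1,599 and 4080 Super's $999 launch MSRPs are assumed for the comparison, and the speedups are the poster's guesses, not benchmarks):

```python
# Each entry: (rumored new price, assumed old launch MSRP, guessed speedup of new over old)
pairs = {
    "RTX 5090 vs RTX 4090":       (2399, 1599, 1.50),
    "RTX 5080 vs RTX 4080 Super": (1299,  999, 1.30),
}
for name, (new_price, old_price, speedup) in pairs.items():
    # perf/$ change: performance uplift divided by price increase
    ratio = speedup / (new_price / old_price)
    print(f"{name}: perf/$ change = {ratio:.3f}x")
# Both come out at ~1.000x, i.e. zero gain in performance per dollar.
```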
Posted on Reply
#27
clopezi
Dimitriman: I bet one coffee that MSRP is $2,399 for the 5090 and $1,299 for the 5080: exactly in line with a 50% faster 5090 over the 4090 and a 30% faster 5080 over the 4080S, and therefore bringing ZERO gains in perf/$.

This is the Leather Jacket way.
A 30% increase over the previous generation is very good for a high-end product.
Posted on Reply
#28
Chomiq
I know at least one guy who will buy a 5090 primarily for LLM applications (business expense) and gaming as secondary use. At this point the 90 series is just an entry-level pro GPU.
Posted on Reply
#29
Prima.Vera
mtosev: I wonder how much the RTX 5090 will cost.
Most likely $2,500 in the US, €2,600 in the EU, ¥450,000 in Japan, and a similar value in the rest of the world.
Posted on Reply
#30
Daven
So let’s see…the 4090 is 25% faster than the 4080 with 67% more CUDA cores and 50% more bandwidth.

The 5090 has 33% more CUDA cores than the 4090 and just under double the bandwidth.

So unless games are bandwidth starved, we shouldn’t expect much over 20% performance increase. I don’t expect an increase in core clocks given the 33% rise in cores matches the 30% rise in power on the same process node.
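A quick sketch of that extrapolation (illustrative only; it assumes core-count scaling behaves as it did between the 4080 and 4090 and ignores clock or architectural changes):

```python
import math

# 4090 vs 4080: ~67% more cores yielded ~25% more performance (figures from the post above)
scaling_exponent = math.log(1.25) / math.log(1.67)   # ~0.44, i.e. well below linear scaling

# 5090 vs 4090: ~33% more cores per the leak
cores_ratio = 1.33
same_scaling_as_last_gen = cores_ratio ** scaling_exponent  # ~1.13x if memory still limits scaling
perfect_core_scaling = cores_ratio                          # ~1.33x if doubled bandwidth removes the bottleneck
print(f"{same_scaling_as_last_gen:.2f}x to {perfect_core_scaling:.2f}x over the 4090")
```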
Posted on Reply
#31
TheinsanegamerN
Vincero: The slot width is one thing... the annoying 'spread' outwards / upwards from the slot is another. There is no way (combined with plugging in the PCIe power connector that supposedly you shouldn't bend within X cm of the plug/socket) that these cards would fit a standard-width case.
People (used to) worry about CPU cooler height, but the GPU makers basically went "hold my beer" and just went nuts.
People demanded quieter GPUs, and that requires larger fans and heatsinks. So OEMs make what sells.

Who cares about "standard" ATX cases anyway? All the new builds I see on social media are using modern vertical GPU mounts or the square split design. Traditional ATX cases are very rare. Even my 2009-era HAF X has plenty of side room for these modern monsters.
Prima.Vera: It's not about a nanny state, it's about regulations and standards. If not for the EU regulations, we would still have 100 types of USB connectors, corporations that use unrestricted personal data for their own gain with absolutely no regard for their users, etc.
There is a reason why so many spyware-infected Chinese companies are banned in the US and EU.
Power regulation is the same. Why do you think there are regulations for household appliances and how much power they can use?
You mean the same "power use" regulations that have made modern appliances 5-year throwaway garbage? THAT's your go-to for why regulation is good?

Also, it's "efficiency", not "power use". Setting hard power limits would make large refrigerators, or freezers, or large washing machines impossible to build.
Prima.Vera: 3.5-slot size, 600 W... things are getting out of control.
We need the EU to step up and stop this nonsense, since no other country outside the EU seems to care.
So if the EU decided that you don't need more than 75 W for gaming, since it is an entirely wasteful hobby, how hard would you have to force yourself to smile while buying a $1000 RX 6400? Just curious.
sephiroth117: They are supposed to make a GAMING GPU lineup.

High-end workstations/AI should have been for the Quadro lineup

This 5090 with 32 GB is not really for gaming per se; this is again for AI. No game requires 32 GB at 4K, and the price will suffer for people who wanted this just for gaming.

If DLSS 4 required more VRAM buffers, then the 5080 surely would have had more than 16 GB...

So yes, once again, this is an AI company now. Given how many billions it generates I understand it, but I'm in the market for a high-end GAMING GPU.
And they do. You can buy anything from an RTX 4060 to an RTX 4090.

No, the 5090 is not a "professional" card. It doesn't have fully enabled FP output, nor does it have ECC memory or pro driver support. It DOES have 32 GB of memory because it's got a 512-bit bus and the only other option would be 16 GB, which would have sent the internet into a hysterical frenzy. As a side benefit, you can tinker with LLMs and AI image generation on this. Yay!

If you can't afford or don't want a 5090, that's fine, go buy a 5070 and enjoy your game library. Or an RX 9070. Or an Intel B580. I swear, if people whined like this a decade ago we never would have gotten anything larger than a GTX 550 Ti.
Posted on Reply
#32
Daven
Dimitriman: I bet one coffee that MSRP is $2,399 for the 5090 and $1,299 for the 5080: exactly in line with a 50% faster 5090 over the 4090 and a 30% faster 5080 over the 4080S, and therefore bringing ZERO gains in perf/$.

This is the Leather Jacket way.
How do you get a 50% increase in performance between 5090 and 4090 with only a 33% increase in CUDA cores when the 4090 only beat the 4080 by 25% with a 67% increase in cores?
Posted on Reply
#33
TheinsanegamerN
Daven: How do you get a 50% increase in performance between 5090 and 4090 with only a 33% increase in CUDA cores when the 4090 only beat the 4080 by 25% with a 67% increase in cores?
Because of the doubling of bandwidth over the 4090. If you have that many more cores and see little progress, that points to memory starvation. That's likely why Nvidia went with a (very expensive) 512-bit bus and GDDR7 combined, to double actual bandwidth over the 4090 despite only having 33% more cores.
Posted on Reply
#34
Random_User
Bomby569: Only poor people like us talk trash about it, we'd all buy it if we could :D

That said, it's stupid and too expensive.
Yes! If the user cannot utilize the device 100% right now, at the time of purchase, they will most likely end up never using it for that purpose. Buy what you actually need it for. If one does only gaming/entertainment and some light tasks, a general-purpose mid or upper-mid-range "gaming" card is all most people need. By the time the next generation of graphics/games launches, the technology in these cards will be obsolete, like the silicon itself, which is planned to be obsolete very soon.
So unless someone needs to do some heavy compute work right now and has the budget for it, I'm sorry, this is just pointless waste.
Posted on Reply
#35
RedelZaVedno
I have no problem with the 5090's positioning and pricing. Nvidia always had the Titan series, which cost around 2,999 USD, for professionals who didn't need Quadro drivers. But I do have a problem with xx80-class GPU performance vs the 5090. Titans used to be less than 20% faster in rasterization vs xx80-class cards; now the difference between the 5080 and 5090 will likely be 50% or more. It might even happen that the 5090 has a similar "cost per frame" metric to the 5080, at least at MSRP pricing. I mean, if the 5080 is priced at $1200 and the 5090 at $2000, where's the added value in buying the 5080 vs the 5090? That is totally unacceptable imho :mad:
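Rough cost-per-frame arithmetic for those hypothetical prices (a sketch; the baseline fps and the 50% gap are assumptions, not benchmarks):

```python
def cost_per_frame(price_usd: float, relative_fps: float) -> float:
    return price_usd / relative_fps

fps_5080 = 100                          # arbitrary baseline
fps_5090 = 150                          # assuming the 5090 ends up ~50% faster
print(cost_per_frame(1200, fps_5080))   # $12.00 per relative frame
print(cost_per_frame(2000, fps_5090))   # ~$13.33 per relative frame, barely worse than the 5080
```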
Posted on Reply
#36
shoman24v
Can I buy this and never need a GPU for the next 10 years?
Posted on Reply
#37
RedelZaVedno
shoman24v: Can I buy this and never need a GPU for the next 10 years?
The 5090 is a beast. You'd probably be OK for the next 3 generations, so 6-7 years.
Posted on Reply
#38
Daven
TheinsanegamerN: Because of the doubling of bandwidth over the 4090. If you have that many more cores and see little progress, that points to memory starvation. That's likely why Nvidia went with a (very expensive) 512-bit bus and GDDR7 combined, to double actual bandwidth over the 4090 despite only having 33% more cores.
I will admit it if I'm wrong, but there is no way these leaked specs point to a 50% increase over the 4090. It will be 30% at best, averaged over the 27 games in the TPU test suite.
Posted on Reply
#39
Prima.Vera
TheinsanegamerN: You mean the same "power use" regulations that have made modern appliances 5-year throwaway garbage? THAT's your go-to for why regulation is good?

So if the EU decided that you don't need more than 75 W for gaming, since it is an entirely wasteful hobby, how hard would you have to force yourself to smile while buying a $1000 RX 6400? Just curious.
You're just talking nonsense.
EU regulations are made after long and comprehensive research done by very smart people, taking all aspects into account. They will never decide to regulate 75 W for a video card. That is plain nonsense.
But they might definitely block a video card from drawing more than 1 kW from your power source, and that's not because of ecology-related things, but mostly because of possible fire hazards, fuse blowing in sensitive buildings, etc., whatever. This is not the Wild West, or Rednek Ville, where everybody can run a 1 MW power plant just because.
If common sense is not in the vocabulary of a corporation, then maybe it is the duty of a state committee or similar to impose it.
Posted on Reply
#40
neatfeatguy
I miss the days that a top end GPU was $650 and if you wanted the Titan brand card you'd pay more, but that 980Ti....oh how I miss those days.

Let's see....what's $650 get you now for a current GPU? Well, after looking at my local Micro Center, nothing. It will buy you nothing for GPUs. With the new gen coming out the restock of the current gen has stopped from the looks of it. You either step up to $850+ range for a 4070Ti Super or down to the $400 range for a 7700XT.

I guess we see if Intel and AMD can really shake up the low to mid tier GPU pricing this gen. Would be nice to see things more affordable for the middle and low end, while still offering a decent performance range. For me, as it currently stands, I'm priced out of the top end GPUs by a lot.
Posted on Reply
#41
Prima.Vera
RedelZaVedno: The 5090 is a beast. You'd probably be OK for the next 3 generations, so 6-7 years.
There are already a lot of games that cannot even output 30 fps on ultra settings, when all quality options are used, on a 4090 card.
Posted on Reply
#42
Onasi
@Prima.Vera
True, but that’s more of a “game development is fucked” thing rather than “insufficient hardware” thing. I wouldn’t be surprised if a year after 5090 launches there will be games with combinations of settings that would bring THAT one to 30 frames too.
Posted on Reply
#43
Prima.Vera
Onasi: @Prima.Vera
True, but that’s more of a “game development is fucked” thing rather than “insufficient hardware” thing. I wouldn’t be surprised if a year after 5090 launches there will be games with combinations of settings that would bring THAT one to 30 frames too.
100% correct.
I remember 10-15 years ago, when with an x80 card I could play a 2-3 year old game with maximum settings and 8x SSAA (best AA method ever produced).
Now?.... :)))
Posted on Reply
#44
Hxx
neatfeatguy: I miss the days that a top end GPU was $650 and if you wanted the Titan brand card you'd pay more, but that 980Ti....oh how I miss those days.

Let's see....what's $650 get you now for a current GPU? Well, after looking at my local Micro Center, nothing. It will buy you nothing for GPUs. With the new gen coming out the restock of the current gen has stopped from the looks of it. You either step up to $850+ range for a 4070Ti Super or down to the $400 range for a 7700XT.

I guess we see if Intel and AMD can really shake up the low to mid tier GPU pricing this gen. Would be nice to see things more affordable for the middle and low end, while still offering a decent performance range. For me, as it currently stands, I'm priced out of the top end GPUs by a lot.
Well, I'm hoping you also make more money now than you did 10 years ago, because the cost of living went up. You can grab a mid-range 7800 XT for about $500, or something similar, and play plenty of new titles. Just because there will be a $3K card doesn't mean it's the only way to enjoy games.
Posted on Reply
#45
TheinsanegamerN
Daven: I will admit it if I'm wrong, but there is no way these leaked specs point to a 50% increase over the 4090. It will be 30% at best, averaged over the 27 games in the TPU test suite.
I'll take that bet. I bet it'll hit 50%.
Prima.Vera: You're just talking nonsense.
EU regulations are made after long and comprehensive research done by very smart people, taking all aspects into account. They will never decide to regulate 75 W for a video card. That is plain nonsense.
Why is that nonsense? Gaming is a wasteful hobby. Why do you need 150 watts to play bing bing wahoo? Hell, why do you need 75 watts? The Switch only pulls like 10 at load, that's all you need for video games.

You say it's nonsense because YOU want a GPU with more than 75 W of capability. Funny how that works. Once you start down the road of "you don't need this", it WILL be taken to its logical extreme. Remember the UK's potato peeler license debacle?

I'd love to hear the sensible argument for NEEDING a GPU of more than 75 watts to play games.
Prima.Vera: But they might definitely block a video card from drawing more than 1 kW from your power source, and that's not because of ecology-related things, but mostly because of possible fire hazards, fuse blowing in sensitive buildings, etc., whatever. This is not the Wild West, or Rednek Ville, where everybody can run a 1 MW power plant just because.
If common sense is not in the vocabulary of a corporation, then maybe it is the duty of a state committee or similar to impose it.
No video card on the consumer side draws more than 1 kW of power. Not even close. The highest was the 600 W the 3090 Ti could tickle. If you're talking whole systems, buddy, SLI gaming PCs were pushing 2 kW back in 2008. Somehow you all survived.

A 1 kW card poses no more of a risk than a 500 W card does, or a 300 W card. If you have such an old building, you should be upgrading the wiring instead of buying GPUs to waste time gaming on.

Here in "rednek ville" we have modern 25-amp wiring with 20-amp breakers that can run a 5090 without burning the house down. Interesting that this is such a concern in the EU, with all those "very smart people" banning everything. Why are you even allowed to have a building with wiring that can't handle a 1,000 W load? That's literal knob-and-tube-style wiring.

If you have such widespread problems with wiring, the solution is NOT to ban GPUs, because you're gonna have to ban fridges, microwaves, and vacuum cleaners too. What you SHOULD be doing is banning that old wiring. This is why the EU gets referred to as a "nanny state".
neatfeatguy: I miss the days that a top end GPU was $650 and if you wanted the Titan brand card you'd pay more, but that 980Ti....oh how I miss those days.

Let's see....what's $650 get you now for a current GPU? Well, after looking at my local Micro Center, nothing. It will buy you nothing for GPUs. With the new gen coming out the restock of the current gen has stopped from the looks of it. You either step up to $850+ range for a 4070Ti Super or down to the $400 range for a 7700XT.

I guess we see if Intel and AMD can really shake up the low to mid tier GPU pricing this gen. Would be nice to see things more affordable for the middle and low end, while still offering a decent performance range. For me, as it currently stands, I'm priced out of the top end GPUs by a lot.
The 980 Ti came out in 2015. Inflation is a thing. Also, to hit on @Hxx's point, I'm making about 60% more per year than I was in 2015. $850 today is $650 from 2015. I really wouldn't want to take a pay cut to 2015 levels just to get $650 GPUs back.

Also obligatory: the 8800 Ultra was $850. In 2006. That's over $1,200 today. Expensive top GPUs are not a new phenomenon.
Posted on Reply
#46
Crazybc
I work with a guy and he's buying one. I have no idea why, but he says he needs one for his gaming rig. I just roll my eyes, because I'll be happy with a 5080 24 GB card if I don't have to sell any body parts to get one. Plus, you have what, 25 watts left on that for overclocking if the specs are right for power consumption. No thanks. And the price? I thought about not checking the prices till I'm at work, standing beside the automatic defibrillators that we have out on the floor.
Posted on Reply
#47
Daven
TheinsanegamerN: I'll take that bet. I bet it'll hit 50%.
We’ll at least know what Nvidia thinks tonight.
Posted on Reply
#48
TheinsanegamerN
Daven: We'll at least know what Nvidia thinks tonight.
Are we basing this bet on Nvidia's numbers, or on TPU's? If TPU's, when does the card actually release?
Posted on Reply
#49
Hxx
Crazybc: I work with a guy and he's buying one. I have no idea why, but he says he needs one for his gaming rig. I just roll my eyes, because I'll be happy with a 5080 24 GB card if I don't have to sell any body parts to get one. Plus, you have what, 25 watts left on that for overclocking if the specs are right for power consumption. No thanks. And the price? I thought about not checking the prices till I'm at work, standing beside the automatic defibrillators that we have out on the floor.
They are going to be very expensive and not worth the price tag. You can figure that out without even watching their stupid keynote or reading benchmarks. High-end stuff like this is almost never "worth it" or justifiable for its price. You either buy it or you don't. It's like walking into a dealership and buying a highly anticipated performance vehicle: there is no special financing, and guess what, there are no incentives. You'll be lucky if they have one on the lot that hasn't been spoken for, and you can't be picky about color.
Posted on Reply
#50
Darc Requiem
Onasi: @Prima.Vera
True, but that’s more of a “game development is fucked” thing rather than “insufficient hardware” thing. I wouldn’t be surprised if a year after 5090 launches there will be games with combinations of settings that would bring THAT one to 30 frames too.
It's a little of both. Engines like UE5 seem to be more resource-heavy than they should be. That said, hardware, as it stands, is not really ready for ray tracing/path tracing. Nvidia has developed a whole software suite to mask the fact that a $1600 GPU that is otherwise capable of 4K high-refresh gaming is running heavily ray-traced games at 1080p and 40-ish fps. But with DLSS and Frame Generation it "looks like 4K and 80 fps".
Posted on Reply