Monday, January 6th 2025

First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

NVIDIA's unannounced GeForce RTX 5090 graphics card has leaked, confirming key specifications of the next-generation GPU. Thanks to exclusive information from VideoCardz, we can see the packaging of Inno3D's RTX 5090 iChill X3 model, which confirms that the graphics card will feature 32 GB of GDDR7 memory. The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system, suggesting significant cooling requirements for the flagship card. According to earlier leaks, the RTX 5090 will be based on the GB202 GPU and include 21,760 CUDA cores. The card's memory system is a significant upgrade, with its 32 GB of GDDR7 memory running on a 512-bit memory bus at 28 Gbps, capable of delivering nearly 1.8 TB/s of bandwidth. This represents twice the memory capacity of the upcoming RTX 5080, which is expected to ship with 16 GB of capacity but faster 30 Gbps GDDR7 modules.
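As a sanity check on the quoted figure, peak memory bandwidth is simply the bus width (in bytes) multiplied by the per-pin data rate. The sketch below uses the leaked numbers from this article; the 256-bit bus assumed for the RTX 5080 is a common rumor, not something the leak confirms:

```python
# Peak bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps.
# Specs taken from the leak discussed above; nothing here is official.
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_5090 = memory_bandwidth_gbps(512, 28.0)  # leaked RTX 5090 figures
rtx_5080 = memory_bandwidth_gbps(256, 30.0)  # 256-bit bus is an assumption
print(f"RTX 5090: {rtx_5090:.0f} GB/s (~{rtx_5090 / 1000:.2f} TB/s)")
print(f"RTX 5080: {rtx_5080:.0f} GB/s")
```

512 bits / 8 × 28 Gbps works out to 1,792 GB/s, which matches the "nearly 1.8 TB/s" in the leak.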

Power consumption has increased significantly, with the RTX 5090's TDP rated at 575 W and TGP of 600 W, marking a 125-watt increase over the previous RTX 4090 in raw TDP. NVIDIA is scheduled to hold its CES keynote today at 6:30 PM PT, where the company is expected to announce several new graphics cards officially. The lineup should include the RTX 5090, RTX 5080, RTX 5070 Ti, RTX 5070, and an RTX 5090D model specifically for the Chinese market. Early indications are that the RTX 5080 will be the first card to reach consumers, with a planned release date of January 21st. Release dates for other models, including the flagship RTX 5090, have not yet been confirmed. The RTX 5090 is currently the only card in the RTX 50 series planned to use the GB202 GPU. Pricing information and additional specifications are expected to be revealed during the upcoming announcement.
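The generational jump can be checked the same way; a minimal sketch comparing the leaked 575 W TDP against the RTX 4090's published 450 W rating:

```python
# Absolute and relative TDP increase; 450 W is the RTX 4090's official TDP,
# 575 W is the leaked RTX 5090 figure cited above.
def tdp_increase(new_watts: float, old_watts: float) -> tuple[float, float]:
    """Return (increase in watts, increase in percent)."""
    delta = new_watts - old_watts
    return delta, delta / old_watts * 100

delta_w, delta_pct = tdp_increase(575, 450)
print(f"+{delta_w:.0f} W, a {delta_pct:.1f}% increase over the RTX 4090")
```

That is the 125-watt increase the article mentions, or roughly 28% more power headroom.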
Source: VideoCardz

81 Comments on First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

#1
_roman_
Finally 32GiB VRAM with GDDR7
#2
Bomby569
that's a lot of vram and a lot of watts. Does any game use 24GB vram even in 4K, at settings that don't make it a slide show?
#3
csendesmark
Bomby569that's a lot of vram and a lot of watts. Does any game use 24GB vram even in 4K, at settings that don't make it a slide show?
This is not for "current" games;
the greater VRAM is for AI-related stuff.
Wish that I could afford this :love:
#4
Bomby569
csendesmarkWish that I could afford this :love:
only poor people like us talk trash about it, we'd all buy it if we could :D

that said, it's stupid and too expensive
#5
clopezi
Bomby569that's a lot of vram and a lot of watts. Does any game use 24GB vram even in 4K, at settings that don't make it a slide show?
You can use it for games, and Nvidia knows it, but it's not designed with "gamers" in mind; it's designed for professionals
#6
csendesmark
Bomby569only poor people like us talk trash about it, we'd all buy it if we could :D

that said, it's stupid and too expensive
I did not talk trash about the card; I might talk trash about the company for its pricing methods. :D
What I wish for is 5080 cards with 2.5-slot thickness or less; I'm not even dreaming about a "lean" 5090.
#7
Vincero
Power consumption has increased significantly, with the RTX 5090's TDP rated at 575 W and TGP of 600 W, marking a 125-watt increase over the previous RTX 4090 in raw TDP
Interesting to see how much of a performance boost any core improvements yield clock-for-clock, combined with that ~25% higher TDP headroom.
If it's not more than 30% faster overall than the outgoing 4090, it's essentially a bit of a waste of time.
#8
sephiroth117
They are supposed to make a GAMING GPU lineup.

High-end workstation/AI cards should have been left to the Quadro lineup.

This 5090 with 32GB is not really for gaming per se; this is again for AI. No game requires 32GB at 4K, and the price will suffer for people who wanted this just for gaming.

If DLSS4 required more VRAM buffers, then the 5080 surely would have had more than 16GB...

So yes, once again... this is an AI company now. Given how many billions it generates I understand it, but I'm in the market for a high-end GAMING GPU.
#9
Bomby569
sephiroth117They are supposed to make a GAMING GPU lineup.

High-end workstation/AI cards should have been left to the Quadro lineup.

This 5090 with 32GB is not really for gaming per se; this is again for AI. No game requires 32GB at 4K, and the price will suffer for people who wanted this just for gaming.

If DLSS4 required more VRAM buffers, then the 5080 surely would have had more than 16GB...

So yes, once again... this is an AI company now. Given how many billions it generates I understand it, but I'm in the market for a high-end GAMING GPU.
This right here.
Gaming is an afterthought for them at this point; if they could, they would probably stop selling gaming GPUs altogether. They shittify the low end with low VRAM and insane prices to push us all to spend more and more, and if the AI crowd snatches up all the high end, even better: they can ask for more and more money with each generation.
#11
Legacy-ZA
clopeziYou can use it for games, and Nvidia knows it, but it's not designed with "gamers" in mind; it's designed for professionals
No, that is the Quadro line, go be a "professional" in your own product line. ;)
#12
Prima.Vera
3.5-slot size, 600 W... things are getting out of control.
We need the EU to step up and stop this nonsense, since no country outside the EU seems to care.
#13
Bomby569
Prima.Vera3.5-slot size, 600 W... things are getting out of control.
We need the EU to step up and stop this nonsense, since no country outside the EU seems to care.
Nanny state! Don't want it, don't buy it; why does making it illegal seem like a good idea at all?
Is it because of power? Will the people that travel in private jets stop other people from using whatever GPU they want?
#14
clopezi
Off-topic, but I live in Spain and 100% of my energy consumption is renewable. It's the standard in our homes. More than 50% of the energy consumed in our country is renewable.

In a few years, electricity consumption will not be a problem for the environment. However, CO2 is and will be. In the same way, water consumption in datacenters is a problem. But not 600 W in a home GPU.
#15
Vincero
The slot width is one thing... the annoying 'spread' outwards/upwards from the slot is another. There is no way (combined with plugging in the PCIe power connector that you supposedly shouldn't bend within X cm of the plug/socket) that these cards would fit a standard-width case.
People (used to) worry about CPU cooler height, but the GPU makers basically went "hold my beer" and just went nuts.
#16
Tomgang
Oof, 575 watts. I already think my 4090's 450 watts is high enough.

GPU power consumption is going in the wrong direction. I hope the performance is there to match.
#17
Vincero
clopeziOff-topic, but I live in Spain and 100% of my energy consumption is renewable. It's the standard in our homes. More than 50% of the energy consumed in our country is renewable.

In a few years, electricity consumption will not be a problem for the environment. However, CO2 is and will be. In the same way, water consumption in datacenters is a problem. But not 600 W in a home GPU.
Unless you're not using the energy grid and instead have a specific localised set-up with no energy import, that's highly unlikely.
Your 'provider' company may generate from renewables, but what gets put into the grid and taken out by everyone connected will be a mix, e.g.:

[chart: Spain's electricity generation mix]

A bit shy of 50%...

www.iea.org/countries/spain/electricity
#18
Bomby569
VinceroThe slot width is one thing... the annoying 'spread' outwards/upwards from the slot is another. There is no way (combined with plugging in the PCIe power connector that you supposedly shouldn't bend within X cm of the plug/socket) that these cards would fit a standard-width case.
People (used to) worry about CPU cooler height, but the GPU makers basically went "hold my beer" and just went nuts.
there is always a ventus
VinceroUnless you're not using the energy grid and instead have a specific localised set-up with no energy import, that's highly unlikely.
Your 'provider' company may generate from renewables, but what gets put into the grid and taken out by everyone connected will be a mix, e.g.:

[chart: Spain's electricity generation mix]

A bit shy of 50%...

www.iea.org/countries/spain/electricity
Nuclear isn't technically renewable, but it also shouldn't count in the same category as oil and gas; that makes no sense. 25% fossil fuels is very good.
#19
JustBenching
32 GB is a bit excessive for gamers. I can see this hitting a $1,999 MSRP.
#20
Vincero
Bomby569there is always a ventus
Unfortunately I don't think that will stop the spread. The Nvidia reference designs basically encourage what I would call bad (non-conforming) design. Coupled with requiring the 12V-2x6 connector and having it point straight out/up away from the socket, whilst knowing its mechanical limitations, it's an additional two fingers up to the consumer.

Combine this with the push by some for BTF motherboards necessitating a thicker area behind the motherboard, and cases are gonna become more cube-shaped.
Bomby569Nuclear isn't technically renewable, but it also shouldn't count in the same category as oil and gas; that makes no sense. 25% fossil fuels is very good.
The chart is just the overall mix - not renewable- or fossil-specific. The guy claimed it was 50% - it isn't. Nuclear isn't renewable but also doesn't count towards CO2 emissions (technically).
#21
mb194dc
Seems like the 5090 will be massively expensive and will probably flop for gamers due to the price, which is rumoured to be around 2.5k-ish. Like a Titan card.

Maybe there's still demand to put it to work on LLMs, but there still seem to be few revenue-generating uses for those. Plus you can rent GPU time for LLMs from datacenters anyway, and prices there have been falling like a rock...
#22
SOAREVERSOR
sephiroth117They are supposed to make a GAMING GPU lineup.

High-end workstation/AI cards should have been left to the Quadro lineup.

This 5090 with 32GB is not really for gaming per se; this is again for AI. No game requires 32GB at 4K, and the price will suffer for people who wanted this just for gaming.

If DLSS4 required more VRAM buffers, then the 5080 surely would have had more than 16GB...

So yes, once again... this is an AI company now. Given how many billions it generates I understand it, but I'm in the market for a high-end GAMING GPU.
That would be the RTX 5080. The X090 are the new Titans. They are for prosumers. The specs and everything are souped up for that market. The majority of these cards sold will never run a game ever. They are put in racks or workstations with 2-8 of them at a time and run ML, DL, AI or anything CUDA related. They won't ever run a game. Even in single configuration there's a reason the Geforce Creator drivers exist and it's for these cards or someone who just wants to learn CUDA.

Quadro is not what you think it is. Quadro is specifically for programs (CAD and CAM, mostly) that require the Quadro drivers and certification. It's a separate market from CUDA tasks. The super-high-end CUDA market is the accelerators way above all this, which few companies can even afford.

Nvidia stopped being a gaming company when they released the 8800 GTX and CUDA. They have stated this. Repeatedly. You keep not getting that.

The high-end GAMING GPU is the RTX 5080. Period. Anything higher than that, you are buying a PROFESSIONAL GPU, not really using it as one, and setting money on fire to game on it.
Legacy-ZANo, that is the Quadro line, go be a "professional" in your own product line. ;)
That's only for products that require the Quadro drivers, such as CAD and CAM. Titan and x090 are prosumer cards for CUDA-based stuff, and Nvidia does have Creator drivers for the GeForce line as well. Nvidia has been very clear about it: they have not been a graphics company since the 8800 GTX and CUDA. These super-high-end cards are CUDA products, not gaming products. But if you have the money, you can game on them.

That gamers don't want to believe what has been true for over a decade now, and has been repeatedly stated by Nvidia, is just gamer syndrome at its finest. Serve it up with a cup of crying about women in video games not being hentai'd up and you have the perfect cocktail.
#23
Vincero
SOAREVERSORThe majority of these cards sold will never run a game ever. They are put in racks or workstations with 2-8 of them at a time and run ML, DL, AI or anything CUDA related. They won't ever run a game. Even in single configuration there's a reason the Geforce Creator drivers exist and it's for these cards or someone who just wants to learn CUDA.
Wasn't that what the 'Tesla' range of cards was for... (then A/H/B)?

The dilution of features between the different product ranges is great for tinkerers and people developing on a budget, but it has been detrimental to some product segments. I'm surprised that Nvidia hasn't rectified this - not to maintain supply or stimulate sales at lower prices, but purely to maintain margins by selling higher-end / higher-feature parts at higher prices.
#24
Prima.Vera
Bomby569Nanny state! Don't want it, don't buy it; why does making it illegal seem like a good idea at all?
Is it because of power? Will the people that travel in private jets stop other people from using whatever GPU they want?
It's not about a nanny state; it's about regulations and standards. If not for EU regulations, we would still have 100 types of USB connectors and corporations using unrestricted personal data for their own gain with absolutely no regard for their users, etc.
There is a reason why so many spyware-infected Chinese companies are banned in the US and EU.
Power regulation is the same. Why do you think there are regulations for household appliances and how much power they can use?
#25
mtosev
I wonder how much the RTX 5090 will cost.