Monday, January 6th 2025

First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

NVIDIA's unannounced GeForce RTX 5090 graphics card has leaked, confirming key specifications of the next-generation GPU. Thanks to exclusive information from VideoCardz, we can see the packaging of Inno3D's RTX 5090 iChill X3 model, which confirms that the graphics card will feature 32 GB of GDDR7 memory. The leaked materials show that Inno3D's variant will use a 3.5-slot cooling system, suggesting significant cooling requirements for the flagship card. According to earlier leaks, the RTX 5090 will be based on the GB202 GPU and include 21,760 CUDA cores. The card's memory system is a significant upgrade, with its 32 GB of GDDR7 memory running on a 512-bit memory bus at 28 Gbps, capable of delivering nearly 1.8 TB/s of bandwidth. This represents twice the memory capacity of the upcoming RTX 5080, which is expected to ship with 16 GB of capacity but faster 30 Gbps GDDR7 modules.
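As a quick sanity check on that bandwidth figure, peak memory bandwidth is simply bus width times per-pin data rate; the short sketch below uses the rumored numbers from the leak (treat both values as unconfirmed until the keynote), and the variable names are only illustrative.

# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
bus_width_bits = 512      # rumored RTX 5090 memory bus
data_rate_gbps = 28       # rumored GDDR7 speed per pin
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 1792 GB/s, i.e. just shy of 1.8 TB/s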

Power consumption has increased significantly, with the RTX 5090's TDP rated at 575 W and TGP of 600 W, marking a 125-watt increase over the previous RTX 4090 in raw TDP. NVIDIA is scheduled to hold its CES keynote today at 6:30 PM PT, where the company is expected to announce several new graphics cards officially. The lineup should include the RTX 5090, RTX 5080, RTX 5070 Ti, RTX 5070, and an RTX 5090D model specifically for the Chinese market. Early indications are that the RTX 5080 will be the first card to reach consumers, with a planned release date of January 21st. Release dates for other models, including the flagship RTX 5090, have not yet been confirmed. The RTX 5090 is currently the only card in the RTX 50 series planned to use the GB202 GPU. Pricing information and additional specifications are expected to be revealed during the upcoming announcement.
Source: VideoCardz

82 Comments on First NVIDIA GeForce RTX 5090 GPU with 32 GB GDDR7 Memory Leaks Ahead of CES Keynote

#51
TheinsanegamerN
Darc RequiemIt's a little of both. Engines like UE5 seem to be more resource-heavy than they should be. That said, hardware, as it stands, is not really ready for raytracing/path tracing. Nvidia has developed a whole software suite to mask the fact that a $1600 GPU that is otherwise capable of 4K high refresh gaming is running heavily raytraced games at 1080p 40ish fps. But with DLSS and Frame Generation it "looks like 4K and 80fps".
Is it that they can't handle raytracing, or is it that game devs can't handle raytracing? Remember when tessellation was the hot buzzword and devs were just leaving an entire plane of water in existence chewing up resources?

Actually properly optimizing lighting is hard, and optimizing in general is hard; that's why we've seen devs just not doing it. The same devs that keep cutting features and delivering half-baked ideas (unless it's the cash shop).

EDIT: As a prime example, Cities: Skylines 2. Every sim/zim/whatever they call them had fully modeled individual 3D teeth, with multiple surfaces. WHY? IT'S A CITY BUILDER! Dumb stuff like that is absolutely everywhere. I'd be more curious about something like the new DOOM using RT on id Tech to see what is actually possible, but with Carmack gone, who knows if it's still got the same level of optimization.
Posted on Reply
#52
Crazybc
HxxThey are going to be very expensive and not worth the price tag. You can figure that out without even watching their stupid keynote and reading benchmarks. High-end stuff like this is almost never "worth it" or justifiable for its price. You either buy it or don't. You are walking into a dealership and buying a highly anticipated performance vehicle. There is no special financing, and guess what, there are no incentives; you'll be lucky if they have one on the lot that hasn't been spoken for, and you can't be picky about color.
I have zero interest in a 5090, maybe a 5080 or 5070, but as I see it a 5090 is just for bragging rights at the moment. Even 4090s are selling new here right now for around 3K. That is simply not happening, as there's no way I could ever justify it.
Even to myself.
Posted on Reply
#53
Hakker
VinceroUnless you're not using the energy grid and instead have a specific localised setup with no energy import, that's highly unlikely.
Your 'provider' company may generate from renewables but what gets put into the grid and taken out by everyone connected will be a mix, e.g.:
A bit shy of 50%...

www.iea.org/countries/spain/electricity
Actually well over 50%. Nuclear is actually the cleanest power of them all. Sure, the waste is toxic, the most notable component being caesium, but in France there is a lot of reprocessing, and in the end only 0.2% of all nuclear waste there is long-term toxic waste. That makes it even a lot cleaner than windmills and solar-powered energy. If you want clean energy you cannot get around nuclear power, because unlike windmill farms or solar parks you can easily regulate its power output.
Coal, oil and natural gas are the ones you should look at as bad energy sources.
Posted on Reply
#54
Legacy-ZA
DavenWe’ll at least know what Nvidia thinks tonight.
It's just the keynote tonight, not the showcase; that's tomorrow, well, at least for me at GMT+2.
Posted on Reply
#55
TheinsanegamerN
HakkerActually well over 50%. Nuclear is actually the cleanest power of them all. Sure, the waste is toxic, the most notable component being caesium, but in France there is a lot of reprocessing, and in the end only 0.2% of all nuclear waste there is long-term toxic waste. That makes it even a lot cleaner than windmills and solar-powered energy. If you want clean energy you cannot get around nuclear power, because unlike windmill farms or solar parks you can easily regulate its power output.
Coal, oil and natural gas are the ones you should look at as bad energy sources.
It's not considered green because to get the uranium, you still either need to use conventional mining techniques, AKA open-pit mining, or in-situ leaching when there's a lot of water. Either way, the mining and milling process to make the fuel rods produces a LOT of radioactive waste, often more than the used cores themselves.

That's why nuclear power is often in its own category.

France is the best-case example for nuclear power and how we could have fixed our many issues decades ago, if people didn't panic over the word "radiation".
Posted on Reply
#56
Vincero
HakkerActually well over 50%. Nuclear is actually the cleanest power of them all. Sure, the waste is toxic, the most notable component being caesium, but in France there is a lot of reprocessing, and in the end only 0.2% of all nuclear waste there is long-term toxic waste. That makes it even a lot cleaner than windmills and solar-powered energy. If you want clean energy you cannot get around nuclear power, because unlike windmill farms or solar parks you can easily regulate its power output.
Coal, oil and natural gas are the ones you should look at as bad energy sources.
That doesn't make nuclear a 'renewable', which was the original claim/point.
It can get lumped into the 0% CO2 bracket (in terms of operating emissions), but that's it... it hasn't really got any 'green' credentials in terms of environmental impact.
Posted on Reply
#57
TheinsanegamerN
VinceroThat doesn't make nuclear a 'renewable', which was the original claim/point.
It can get lumped into the 0% CO2 bracket (in terms of operating emissions), but that's it... it hasn't really got any 'green' credentials in terms of environmental impact.
I mean, that's debatable. Nuke plants glow pretty "green" after all : )

In seriousness, nuke plants don't kill birds like windmills do, nor do they require the huge land clearing of large solar farms, nor do they bake the soil like solar can. The production of solar panels still produces lots of toxic by-products which get dumped into rivers, and there is still no good way to dispose of the fluid from turbines once it's used up, so it easily pollutes water tables.

Back to the topic, you will need your own mini nuke for your 5090. I'm all for it, where do I get my complimentary Chirkov sticker?
Posted on Reply
#58
RedelZaVedno
Prima.VeraThere are already a lot of games that cannot even output 30 fps on ultra settings, when all quality options are used, on a 4090 card.
No matter how much great hardware you throw at shitty code, it will still run like shit. You can't solve game developers' incompetence with better hardware. Well, you can to some extent, but sooner or later you hit the limits of bad code.
Posted on Reply
#59
neatfeatguy
TheinsanegamerNThe 980 Ti came out in 2015. Inflation is a thing. Also, to hit on @Hxx's point, I'm making about 60% more per year than I was in 2015. $850 today is $650 from 2015. I really wouldn't want to take a pay cut to 2015 levels just to get $650 GPUs back.

Also obligatory: the 8800 Ultra was $850. In 2006. That's over $1200 today. Expensive top GPUs are not a new phenomenon.
Hate when people say "inflation is a thing". Also, no one cares how much more you make now than you did back then. Good for you that you do.

Even at $850 you still only get a 4070 Ti Super. No thanks. That still isn't a top-end GPU. Sure, it gets you close, but it isn't top end. The 4080 would be low-high end, whereas the 4090 would be top high-end. The 4070 would be low-mid, with the Ti Super being high-mid.

So if you want to use inflation: back when the 980 Ti was $650, the 970 was a mid-tier card and sold for $330. With inflation, at today's prices you would be spending $445. What does $445 buy you for a GPU in the current gen? A 4060 Ti... so you went from a mid-range card in 2015 at $330 to an entry-level card for $445 (let's be honest, anything under a 4060 isn't really a good buy for a GPU at these prices), because you can't get a mid-range RTX 4070 since it is $600.
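A minimal sketch of the inflation adjustments being traded back and forth here; the cumulative CPI multipliers are rough assumptions (approximate US CPI to 2024), not official figures, and the launch prices are the ones cited in the thread.

# Rough inflation adjustment of the launch prices cited above.
# The CPI multipliers are approximate assumptions (US CPI to ~2024), not official figures.
cpi_to_today = {2006: 1.55, 2015: 1.33}

launch_prices = [
    ("8800 Ultra", 2006, 850),   # price as cited in the thread
    ("GTX 980 Ti", 2015, 650),
    ("GTX 970",    2015, 330),
]

for card, year, usd in launch_prices:
    adjusted = usd * cpi_to_today[year]
    print(f"{card} ({year}): ${usd} then is roughly ${adjusted:.0f} in today's dollars")

Under those assumed multipliers this lands close to the $1,200+, $850 and $445 figures both posters quote.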
Posted on Reply
#60
Bomby569
There is "inflation" and there is "nvidia inflation", stop using the wrong one please. :D
Posted on Reply
#61
Vincero
Ideally the price increases from inflation would have been offset by process improvements making the tech cheaper... none of that is really the case anymore.

You want the latest x nm process tech... you've got to pay...

It's quite amusing that Apple (who have a history of playing hardball with suppliers on costs) are always so willing to pay the premium charged by TSMC for it...
Posted on Reply
#62
HeadRusch1
You look at this and then you look at the last two years' worth of game releases you probably haven't bought and played because they're hot garbage and/or not yet free on Epic...
Posted on Reply
#63
TheinsanegamerN
neatfeatguyHate when people say "inflation is a thing". Also, no one cares how much more you make now than you did back then. Good for you that you do.
I'm sorry you hate reality. In just the last 4 years the money supply of most countries has expanded by 50-100%. It is actually crazy to think that, with a doubling of the money supply, entry-level burger flippers going from $7.49 an hour to almost $20, and normal office workers going from $50k a year to $80k+, the price of hardware should just stay the same.
neatfeatguyEven at $850 you still only get a 4070 Ti Super. No thanks. That still isn't a top-end GPU. Sure, it gets you close, but it isn't top end. The 4080 would be low-high end, whereas the 4090 would be top high-end. The 4070 would be low-mid, with the Ti Super being high-mid.
So, I see the problem: you've got the categories totally screwed up. The 4070 is a low-mid card? ROFL no. That's the 4060. The 4070 was an upper-mid card, the 4080 was the high end, and the 4090 was the silly halo card. The 4070 Ti would be the low-high end card; even if not the flagship, the 4070 Ti is a VERY capable card.
neatfeatguySo if you want to use inflation: back when the 980 Ti was $650, the 970 was a mid-tier card and sold for $330. With inflation, at today's prices you would be spending $445. What does $445 buy you for a GPU in the current gen? A 4060 Ti... so you went from a mid-range card in 2015 at $330 to an entry-level card for $445 (let's be honest, anything under a 4060 isn't really a good buy for a GPU at these prices), because you can't get a mid-range RTX 4070 since it is $600.
The 4060 Ti isn't an entry-level card either. The 4060 is low end and the 3050 is technically your entry level. I'd never consider one; I'd consider the 4060 as the entry level.

There are other things to look at as well. Maxwell was built on a cheaper node. The 28 nm TSMC node cost about $5,000 per wafer back in 2014; Ada's node was around $20,000 per wafer, so that's 4x the silicon cost. GDDR6/X memory is also significantly more expensive than GDDR5 was, the coolers are bigger (and quieter), etc. So yeah, the 4070 is about 104 mm² smaller than the 970, but it likely costs twice to three times as much to actually make as Maxwell did. Do you expect Nvidia to eat a loss there, so you can buy one for $330?
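A rough way to see how those wafer prices translate into silicon cost per chip: the sketch below uses the wafer prices cited above plus assumed die areas taken from public die-size listings (GM204 roughly 398 mm², AD104 roughly 294 mm²), and it ignores defect yield, so treat the results as optimistic floor estimates rather than Nvidia's actual costs.

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Common first-order approximation: usable wafer area minus an edge-loss term; defects ignored.
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Wafer prices as cited in the post; die areas are assumptions from public die-size listings.
chips = [
    ("GM204 (GTX 970, 28 nm)", 398, 5_000),
    ("AD104 (RTX 4070, TSMC 4N)", 294, 20_000),
]
for name, area, wafer_cost in chips:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} candidate dies/wafer, ~${wafer_cost / n:.0f} of raw silicon per die")

Ignoring yield, that works out to roughly $35 of raw silicon for a GM204 versus roughly $100 for an AD104, which is in the same ballpark as the two-to-three-times figure argued above.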
Bomby569There is "inflation" and there is "nvidia inflation", stop using the wrong one please. :D
*sigh*

Nvidia's margins during the height of maxwell: 55-57%
Nvidia's margins during peak COVID boom (on the much cheaper samsung 8nm node): 65%
Nvidia's margins during the height of Ada: 56%

Can we let this misinformation die already? I know it's easy to just blame everything on NgReEdIa, but the reality is there's more to the world than Nvidia. The only reason their margins are so high right now is the AI GPU boom. If they were still primarily GeForce they'd still be at about the same margin level.
Posted on Reply
#64
Bomby569
TheinsanegamerNNvidia's margins during the height of maxwell: 55-57%
Nvidia's margins during peak COVID boom (on the much cheaper samsung 8nm node): 65%
Nvidia's margins during the height of Ada: 56%

Can we let this misinformation die already? I know it's easy to just blame everything on NgReEdIa, but the reality is there's more to the world than Nvidia. The only reason their margins are so high right now is the AI GPU boom. If they were still primarily GeForce they'd still be at about the same margin level.
Where did you get those numbers from?
And I'm not sure you are making complete sense: the COVID era had higher margins, now the margins are the same as in the Maxwell era, but you claim margins are "so high right now because of AI" :wtf:
Posted on Reply
#65
Onasi
Bomby569And I'm not sure you are making complete sense: the COVID era had higher margins, now the margins are the same as in the Maxwell era, but you claim margins are "so high right now because of AI" :wtf:
I assume he meant “profits” on that second one. The margins are similar to Maxwell, they just sell a lot more of higher priced SKUs.
Posted on Reply
#66
TheinsanegamerN
Bomby569Where did you get those numbers from?
And I'm not sure you are making complete sense: the COVID era had higher margins, now the margins are the same as in the Maxwell era, but you claim margins are "so high right now because of AI" :wtf:
I said "margins during the height of ADA", not "margins today".

In the graph I linked below, you will find that Nvidia's margins in April of 2023 were hitting roughly the same level they were hitting in late 2014. Shortly after that, the AI market began its boom cycle, and their margins shot up to 75%, where they sit today. They are selling a LOT of $40,000 AI accelerators.

But on purely GeForce sales, no, Ada's pricing was not "Nvidia inflation". It was regular old inflation, courtesy of our glorious leaders printing more money than they spent FIGHTING WWII to react to a nasty cold, driving up the costs of everything to insane levels.
OnasiI assume he meant “profits” on that second one. The margins are similar to Maxwell, they just sell a lot more of higher priced SKUs.
No, I meant margins. I've posted this source many times before

www.macrotrends.net/stocks/charts/NVDA/nvidia/gross-margin

They get their info from Nvidia's quarterly statements.

There's something else interesting in there: during the days of $500 Fermi flagships, AKA the "good old days", Nvidia's margin was only 31% during the 400 series and 38% during the 500 series. Those are closer to what AMD was doing until recently, and we saw how well that turned out.
Posted on Reply
#67
Bomby569
TheinsanegamerNI said "margins during the height of ADA", not "margins today".

In the graph I linked below, you will find that Nvidia's margins in April of 2023 were hitting roughly the same level they were hitting in late 2014. Shortly after that, the AI market began its boom cycle, and their margins shot up to 75%, where they sit today. They are selling a LOT of $40,000 AI accelerators.

But on purely GeForce sales, no, Ada's pricing was not "Nvidia inflation". It was regular old inflation, courtesy of our glorious leaders printing more money than they spent FIGHTING WWII to react to a nasty cold, driving up the costs of everything to insane levels.

No, I meant margins. I've posted this source many times before

www.macrotrends.net/stocks/charts/NVDA/nvidia/gross-margin

They get their info from Nvidia's quarterly statements.

There's something else interesting in there: during the days of $500 Fermi flagships, AKA the "good old days", Nvidia's margin was only 31% during the 400 series and 38% during the 500 series. Those are closer to what AMD was doing until recently, and we saw how well that turned out.
That's not product margins; you're mixing concepts that aren't related. What would Nvidia's overall gross margin have to do with the discussion on GPU prices?

And revenue in the segment we care about, gaming, has barely moved since the COVID/crypto boom, so it's clear they are selling a lot less and for a lot more money (product margin) to make almost the same overall.
stockanalysis.com/stocks/nvda/metrics/revenue-by-segment/
Your assumptions are incorrect.
Posted on Reply
#68
TheinsanegamerN
Bomby569That's not product margins; you're mixing concepts that aren't related. What would Nvidia's overall gross margin have to do with the discussion on GPU prices?
Nvidia's primary product is GPUs. If GPU prices were going up because of "muh nvidia inflation" instead of actual inflation, then this would be reflected in Nvidia's gross margin increasing. It isn't; thus, the price increases are being absorbed by the cost of doing business, higher transportation costs, higher wafer costs, higher wages, etc.
Bomby569And revenue in the segment we care about, gaming, has barely moved since the COVID/crypto boom, so it's clear they are selling a lot less and for a lot more money (product margin) to make almost the same overall.
stockanalysis.com/stocks/nvda/metrics/revenue-by-segment/
Your assumptions are incorrect.
If the revenue is flat and they are selling fewer GPUs, then the price of the GPUs would increase, BUT so would costs. If the cost per GPU were lower and they sold fewer but at a higher price, then margin per GPU would be increasing, and that would be reflected in their overall margin, as this is their primary product. That isn't what is happening, though: revenue for gaming is flat, and so was their margin before the AI boom. So increased costs are eating the price increases.

My assumptions are sound.
Posted on Reply
#69
neatfeatguy
TheinsanegamerNI'm sorry you hate reality. In just the last 4 years the money supply of most countries has expanded by 50-100%. It is actually crazy to think that, with a doubling of the money supply, entry-level burger flippers going from $7.49 an hour to almost $20, and normal office workers going from $50k a year to $80k+, the price of hardware should just stay the same.
I have no issues with reality. I know prices have spiked everywhere. What doesn't jibe is the high price spikes for the top-end cards; they don't make any sense no matter what kind of spin you try to put on it. What I do know is that once people started dropping $1200+ for a 3080 it was game over. A 3080 that released for $699 and was soon retailing for nearly double at $1200+, that locked things in. Now the xx80 cards are priced in that $1000+ range, as you can see from the 4080 and 4080 Super and the 5080 rumored to be around $1250. You can try to justify the high prices all you want, but in the end it was nothing more than consumer and corporate greed that caused this. I have every right to bitch about it; that doesn't mean anything will be done about it. I'll back up my complaint about the prices and not pay them.
TheinsanegamerNSo, I see the problem: you've got the categories totally screwed up. The 4070 is a low-mid card? ROFL no. That's the 4060. The 4070 was an upper-mid card, the 4080 was the high end, and the 4090 was the silly halo card. The 4070 Ti would be the low-high end card; even if not the flagship, the 4070 Ti is a VERY capable card.

The 4060 Ti isn't an entry-level card either. The 4060 is low end and the 3050 is technically your entry level. I'd never consider one; I'd consider the 4060 as the entry level.
The 3050 isn't this gen; Nvidia just didn't do a 4050 because it probably wouldn't have been able to perform any better than the 3060 that came with 12 GB, and it would have been priced at a point that didn't make any sense with less VRAM than the 3060 model. That leaves us with this gen looking like:

4060s are low tier.
4070s are mid tier (4070 low-mid, 4070 Super and 4070Ti aren't far apart for mid and 4070Ti Super for top-mid).
4080s are high tier.
4090 is top end.
Posted on Reply
#70
redeye
TheinsanegamerNI'm sorry you hate reality. In just the last 4 years the money supply of most countries has expanded by 50-100%. It is actually crazy to think that, with a doubling of the money supply, entry-level burger flippers going from $7.49 an hour to almost $20, and normal office workers going from $50k a year to $80k+, the price of hardware should just stay the same.

So, I see the problem: you've got the categories totally screwed up. The 4070 is a low-mid card? ROFL no. That's the 4060. The 4070 was an upper-mid card, the 4080 was the high end, and the 4090 was the silly halo card. The 4070 Ti would be the low-high end card; even if not the flagship, the 4070 Ti is a VERY capable card.

The 4060 Ti isn't an entry-level card either. The 4060 is low end and the 3050 is technically your entry level. I'd never consider one; I'd consider the 4060 as the entry level.

There are other things to look at as well. Maxwell was built on a cheaper node. The 28 nm TSMC node cost about $5,000 per wafer back in 2014; Ada's node was around $20,000 per wafer, so that's 4x the silicon cost. GDDR6/X memory is also significantly more expensive than GDDR5 was, the coolers are bigger (and quieter), etc. So yeah, the 4070 is about 104 mm² smaller than the 970, but it likely costs twice to three times as much to actually make as Maxwell did. Do you expect Nvidia to eat a loss there, so you can buy one for $330?

*sigh*

Nvidia's margins during the height of maxwell: 55-57%
Nvidia's margins during peak COVID boom (on the much cheaper samsung 8nm node): 65%
Nvidia's margins during the height of Ada: 56%

Can we let this misinformation die already? I know it's easy to just blame everything on NgReEdIa, but the reality is there's more to the world than Nvidia. The only reason their margins are so high right now is the AI GPU boom. If they were still primarily GeForce they'd still be at about the same margin level.
No, GROSS margins are at least 70% for Nvidia, google "gross margins"!… Apple, AMD, almost everybody has around 50% GROSS margins. Perhaps your numbers are NET margins! And if they are NET margins, they are higher than Apple's, AMD's, everybody's GROSS margins!
So: Nvidia inflation…
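It may help to separate the two terms being argued about here: gross margin is (revenue minus cost of goods sold) divided by revenue, while net margin also subtracts operating expenses, R&D and taxes before dividing. A minimal sketch with purely illustrative numbers (assumed for the example, not taken from any actual filing):

# Gross vs. net margin, with purely illustrative numbers (not from any actual filing).
revenue = 100.0          # hypothetical revenue
cogs = 27.0              # hypothetical cost of goods sold (wafers, memory, boards, ...)
opex_rnd_tax = 43.0      # hypothetical operating expenses, R&D and taxes combined

gross_margin = (revenue - cogs) / revenue
net_margin = (revenue - cogs - opex_rnd_tax) / revenue

print(f"gross margin: {gross_margin:.0%}")  # 73%
print(f"net margin:   {net_margin:.0%}")    # 30%

The same company can therefore show a roughly 70% gross margin and a much lower net margin, which is why the two figures shouldn't be compared directly.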
Posted on Reply
#71
RedelZaVedno
The truth is Nvidia is not primarily a gaming company anymore. Its third-quarter 2024 gaming revenue was $3.3 billion, while data center brought in a whopping $30.8 billion! Sure, they'll still sell us the scraps and leftovers they can't sell to data centers, but that's it. Take it or leave it. Jensen sure doesn't care either way as long as his AI enterprise inflow stays uninterrupted. Don't get me wrong, I hate this, but it's just the way it is atm :mad:
Posted on Reply
#72
Hakker
VinceroThat doesn't make Nuclear a 'renewable' which was the original claim / point.
It can get lumped in the 0% CO2 bracket (in terms of operating emissions), but that's it... it hasn't really got any 'green' credentials in terms of environmental impact.
It actually has better green credentials. Solar parks reflect more light than they absorb, which also warms up the planet, nor is production of the solar cells green in that regard. The same goes for the number of birds that die at windmill farms. Nuclear is in fact greener than solar. Hydro is the cleanest of them all, but pretty limited in usage and has its own environmental impacts.
Then you still have the problem of scaling your energy output up or down, which is far more difficult with both options than with a nuclear plant. If you want the world to really go green then nuclear is an unavoidable power source at this time, unless fusion power makes a breakthrough.

The problem with nuclear plants is that all people see is Chernobyl: a badly designed nuclear plant from the start, and they forget it happened because of bad management/operational error. At the same time it was somewhat a blessing in disguise, because the world learned so enormously much about what not to do and the implications of when it goes horribly wrong.
Posted on Reply
#73
Onasi
RedelZaVednoThe truth is Nvidia is not primarily a gaming company anymore.
They haven't been since CUDA was introduced. It's just that the numbers have now caught up. Gaming stopped being a focus the moment GPGPU became a thing and the concept of a "3D accelerator" and "video card" died. I am baffled that it took people almost 18 years to realize this.

@TheinsanegamerN
Thanks for the link to that margins chart, haven’t seen it before. Mighty curious information. I think that your take that the current elevated margins are caused by the AI demand outstripping supply is more than fair then.
Posted on Reply
#74
Hakker
RedelZaVednoThe truth is Nvidia is not primarily a gaming company anymore. Its third-quarter 2024 gaming revenue was $3.3 billion, while data center brought in a whopping $30.8 billion! Sure, they'll still sell us the scraps and leftovers they can't sell to data centers, but that's it. Take it or leave it. Jensen sure doesn't care either way as long as its AI enterprise inflow stays uninterrupted. Don't get me wrong, I hate this, but it's just the way it is atm :mad:
Gaming is not Intel's or AMD's main income either. Heck, neither even sees consumers as their main source of income. Servers/data centers are where both of those companies' eyes have also been since forever.
Posted on Reply
#75
Vincero
HakkerIt actually has better green credentials. Solar parks reflect more light than they absorb, which also warms up the planet, nor is production of the solar cells green in that regard. The same goes for the number of birds that die at windmill farms. Nuclear is in fact greener than solar. Hydro is the cleanest of them all, but pretty limited in usage and has its own environmental impacts.
Then you still have the problem of scaling your energy output up or down, which is far more difficult with both options than with a nuclear plant. If you want the world to really go green then nuclear is an unavoidable power source at this time, unless fusion power makes a breakthrough.

The problem with nuclear plants is that all people see is Chernobyl: a badly designed nuclear plant from the start, and they forget it happened because of bad management/operational error. At the same time it was somewhat a blessing in disguise, because the world learned so enormously much about what not to do and the implications of when it goes horribly wrong.
Hey, I'm not arguing against it - I'm not debating what is/isn't better - everything has pros and cons, and there's no such thing as truly clean energy yet when you include the full lifetime environmental impact.
What this originally stemmed from was nuclear being lumped into the same bracket as renewables, primarily because it's 'carbon neutral' - that's really a mistake, and I'm unsure if it's deliberate by the energy industry to try and lift its PR rep with the public.

Chernobyl is an outlier - the Fukushima problems are actually more worrying: a supposedly well-designed facility in a well-off first-world country having multiple containment failures. Yeah, for sure, a natural disaster started the sequence of events, but it's a reminder that even after a shutdown/scram there is still quite an immediate problem of maintaining control to cool everything down.
Posted on Reply