Thursday, September 5th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

NVIDIA is on the brink of finalizing its next-generation "Blackwell" graphics cards, the GeForce RTX 5090 and RTX 5080. Sources close to BenchLife indicate that NVIDIA is targeting September for the official design specification finalization of both models. This timeline hints at a possible unveiling at CES 2025, with a market release shortly after. The RTX 5090 is rumored to boast a staggering 550 W TGP, a significant 22% increase from its predecessor, the RTX 4090. Meanwhile, the RTX 5080 is expected to draw 350 W, a more modest 9.3% bump from the current RTX 4080. Interestingly, NVIDIA appears to be developing "D" variants for both cards, which are likely tailored for the Chinese market to comply with export regulations.

Regarding raw power, the RTX 5090 is speculated to feature 24,576 CUDA cores paired with 512-bit GDDR7 memory. The RTX 5080, while less mighty, is still expected to pack a punch with 10,752 CUDA cores and 256-bit GDDR7 memory. As NVIDIA prepares to launch these powerhouses, rumors suggest the RTX 4090D may be discontinued by December 2024, paving the way for its successor. We are curious to see how power consumption is handled and whether these cards remain efficient within their higher power envelopes. Some rumors indicate that the RTX 5090 could reach 600 watts at its peak, while the RTX 5080 could reach 400 watts; however, that is just a rumor for now. As always, until NVIDIA makes an official announcement, these details should be taken with a grain of salt.
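As a quick sanity check of the percentages above, here is a minimal arithmetic sketch, assuming the commonly cited stock TGPs of 450 W for the RTX 4090 and 320 W for the RTX 4080 (the next-gen figures themselves are still rumors):

```python
# Back-of-the-envelope check of the rumored generational TGP increases.
# Assumed current-gen stock TGPs: RTX 4090 = 450 W, RTX 4080 = 320 W.
current = {"RTX 4090": 450, "RTX 4080": 320}
rumored = {"RTX 5090": 550, "RTX 5080": 350}

for new, old in (("RTX 5090", "RTX 4090"), ("RTX 5080", "RTX 4080")):
    increase = (rumored[new] / current[old] - 1) * 100
    print(f"{new}: {rumored[new]} W vs {old} at {current[old]} W -> +{increase:.1f}%")

# RTX 5090: 550 W vs RTX 4090 at 450 W -> +22.2%
# RTX 5080: 350 W vs RTX 4080 at 320 W -> +9.4%
```

which lines up with the roughly 22% and 9% increases quoted above.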
Sources: BenchLife, via Wccftech

88 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

#51
evernessince
TheDeeGeeEveryone will go ape about the TDP, but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 Watt (Default 265) and only lost 5-6% performance.
You are conflating theoretical maximum efficiency with actual real world efficiency. Most people run stock which means their 5090 will be guzzling 600w+ in games.

I run my 4090 power capped but I'm under no illusion that it's something most customers will do.
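For anyone wondering what "power capped" looks like in practice, here is a minimal sketch using the NVML Python bindings (nvidia-ml-py/pynvml); the 300 W target is an arbitrary example rather than a recommendation, and setting the limit generally requires administrator/root privileges:

```python
# Minimal power-cap sketch via NVML (pip install nvidia-ml-py).
# The 300 W target is an arbitrary example, not a recommended value.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports and accepts power values in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
print(f"Allowed limit range: {min_mw // 1000}-{max_mw // 1000} W, "
      f"current limit: {current_mw // 1000} W")

target_mw = 300 * 1000
if min_mw <= target_mw <= max_mw:
    # Needs elevated privileges; the cap typically resets on driver reload/reboot.
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
    print(f"Power limit set to {target_mw // 1000} W")

pynvml.nvmlShutdown()
```

The same thing can be done from the usual driver or vendor overclocking tools; the point is just that the cap is a one-line setting, not something the average buyer will ever touch.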
Posted on Reply
#52
dgianstefani
TPU Proofreader
evernessinceYou are conflating theoretical maximum efficiency with actual real world efficiency. Most people run stock which means their 5090 will be guzzling 600w+ in games.

I run my 4090 power capped but I'm under no illusion that it's something most customers will do.
Even the 4090 which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.



I doubt the 5090 will be much different, maybe 450 W.
Posted on Reply
#53
RogueSix
dgianstefaniEven the 4090 which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.



I doubt the 5090 will be much different, maybe 450 W.
Exactly. Most of us will rarely (if ever) even get close to peak power consumption. Me, for example, I have a 43" ASUS ROG monitor with 4K/144Hz resolution and I'm always limiting my fps in games to 120fps whenever possible. Otherwise, I have V-Sync enabled globally which is nVidia's recommended setting for G-Sync (compatible) displays, i.e. I will always stay within the sync range of 144Hz/fps max.

The RTX 5090 will be able to reach those 120fps or 144fps even easier than the RTX 4090 so it will be a beast in terms of performance and efficiency. The theoretical 600W peak power consumption which is already present on some RTX 4090 custom designs (not mine... MSI Suprim X has a 520W limit) is of very little practical relevance.
Posted on Reply
#54
GhostRyder
OnasiHalf a year. Closer to a third, really. CES is early January. And this is in line with what we’ve heard about 5000 series being planned for early 2025. It makes some amount of sense - NV doesn’t have real competition to rush them along, the 4000 series still sells well and getting rid of existing inventory before launching new cards is prudent and for now NV is probably more keen on using fab allocation on enterprise Blackwell accelerators to fully capitalize on the demand.
You're right, I had the wrong event in my head when I wrote that, so it's likely sometime in Q1.
dgianstefaniEven the 4090 which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.



I doubt the 5090 will be much different, maybe 450 W.
I mean, sure, but that's still pretty high, and it's also going to vary based on the game. 450 watts is still a lot when you think about it, especially when you need to cool it. But then again, I think power consumption gets overblown on the top end constantly; if you are buying this card you probably only care about the extra 5 FPS you can get over everyone else, not how few watts it can sip. I only ever complain about power consumption when it becomes too hard to keep the card cool and it gets held back because of it.
Posted on Reply
#55
dgianstefani
TPU Proofreader
GhostRyderYou're right, I had the wrong event in my head when I wrote that, so it's likely sometime in Q1.


I mean, sure, but that's still pretty high, and it's also going to vary based on the game. 450 watts is still a lot when you think about it, especially when you need to cool it. But then again, I think power consumption gets overblown on the top end constantly; if you are buying this card you probably only care about the extra 5 FPS you can get over everyone else, not how few watts it can sip. I only ever complain about power consumption when it becomes too hard to keep the card cool and it gets held back because of it.
"5 FPS".

Approx 50% faster than the (more expensive) 3090 Ti.



That's while using 100 W less than the 3090 Ti BTW.



I don't expect quite so much of a jump from the 4090 to the 5090 since it's not getting a major node improvement, but there's still a lot to gain from architecture, and I do get tired of "600 W" when it's closer to half that in stock gaming draw, and ~'miniscule differences that you can't notice' criticism of halo products.
Posted on Reply
#56
AnotherReader
dgianstefaniEven the 4090 which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.



I doubt the 5090 will be much different, maybe 450 W.
In raytraced games, the power consumption is much higher.

Posted on Reply
#57
gffermari
The gap between the 4090 and the 4080 was already quite big. But 24K vs 10K? That's insane.
Posted on Reply
#58
dgianstefani
TPU Proofreader
AnotherReaderIn raytraced games, the power consumption is much higher.

Sure, but it's still not "600 W".
gffermariThe gap between the 4090 and the 4080 was already quite big. But 24K vs 10K? That's insane.
Half the memory bus too, 256-bit vs 512-bit, but these are just rumours. I strongly suspect those are just full-die numbers, and the actual chips will be cut down, so the difference won't be as significant.

E.g. with the 4090/4080: the 4080S uses the full die, but the 4090 isn't even close.
Posted on Reply
#59
AnotherReader
dgianstefaniSure, but it's still not "600 W".
I don't think even the factory overclocked 4090 variants use 600 W at stock.
Posted on Reply
#60
dgianstefani
TPU Proofreader
AnotherReaderI don't think even the factory overclocked 4090 variants use 600 W at stock.
Yes, despite the hyperbole people like to use to emphasise a non-existent point regarding what they consider "efficiency". The waterblocked ROG Matrix $3k card as an example uses 440 W instead of 411 W (FE), which isn't even 10% more power, for about 4% more speed, and that's about as extreme as it gets for OC'd cards.
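Taking those quoted numbers at face value, here is a quick back-of-the-envelope check (the figures come from the post above, not independent measurements) of why the factory OC barely moves the efficiency needle:

```python
# Figures quoted above: FE ~411 W at baseline speed, ROG Matrix ~440 W for ~4% more speed.
fe_power_w, oc_power_w = 411, 440
oc_speedup = 1.04  # ~4% faster than the FE

extra_power_pct = (oc_power_w / fe_power_w - 1) * 100        # ~7.1% more power
rel_perf_per_watt = oc_speedup / (oc_power_w / fe_power_w)   # ~0.97x the FE's perf/W

print(f"Extra power draw: +{extra_power_pct:.1f}%")
print(f"Perf/W relative to FE: {rel_perf_per_watt:.2f}x")
```

So even the most extreme out-of-the-box OC card trades a few percent of efficiency for a few percent of speed, nowhere near "600 W" territory.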
Posted on Reply
#61
GhostRyder
dgianstefani"5 FPS".

Approx 50% faster than the (more expensive) 3090 Ti.



That's while using 100 W less than the 3090 Ti BTW.



I don't expect quite so much of a jump from the 4090 to the 5090 since it's not getting a major node improvement, but there's still a lot to gain from architecture, and I do get tired of "600 W" when it's closer to half that in stock gaming draw, and ~'miniscule differences that you can't notice' criticism of halo products.
I never said 600 or mentioned that; also, the 5 FPS remark was more just saying people generally want the extra performance and aren't expecting the best efficiency when purchasing a halo product. Still, we won't know till it comes out how much it uses at peak when gaming.
Posted on Reply
#62
dgianstefani
TPU Proofreader
GhostRyderI never said 600 or mentioned that; also, the 5 FPS remark was more just saying people generally want the extra performance and aren't expecting the best efficiency when purchasing a halo product. Still, we won't know till it comes out how much it uses at peak when gaming.
If you scroll up you'll see 600 W mentioned quite a few times.
Posted on Reply
#63
uco73
The more frames, the higher the consumption. I've been turning on V-sync for a long time in every game. When the frame rate is over 200 or 300 or more, the graphics card's consumption is over 300 W. When V-sync is on, consumption is below 100 W. And what can you say to all that? Is there really a need for the frame rate to be over 60? How many frames per second is the human eye capable of processing?
Posted on Reply
#64
dgianstefani
TPU Proofreader
uco73The more frames, the higher the consumption. I've been turning on V-sync for a long time in every game. When the frame rate is over 200 or 300 or more, the graphics card's consumption is over 300 W. When V-sync is on, consumption is below 100 W. And what can you say to all that? Is there really a need for the frame rate to be over 60? How many frames per second is the human eye capable of processing?
More than you think. The eye and brain work somewhat in parallel: while each cluster of "sensors" within the eye can only see up to a certain "framerate", your brain processes many clusters at the same time, so the effective framerate the brain perceives can be much higher than what the eye "hardware" can strictly do, because it's working with more than a single "input".

But you're right about locking the max frame rate to the max refresh rate of your monitor; it makes for much lower input lag anyway.
Posted on Reply
#65
HughMungus
Legacy-ZAWell, because, there will be in-betweens, the 5080 Ti, The 5080 Super, the 5080 Super Duper, the 5080 Hyper Speed, the 5080 GTFO etc. :roll:
Not far from the truth
Posted on Reply
#66
evernessince
dgianstefaniYes, despite the hyperbole people like to use to emphasise a non-existent point regarding what they consider "efficiency". The waterblocked ROG Matrix $3k card as an example uses 440 W instead of 411 W (FE), which isn't even 10% more power, for about 4% more speed, and that's about as extreme as it gets for OC'd cards.
I'm not sure who's making the argument that they are 600w cards but yeah they aren't. The 5090 might be though, we will have to wait and see.
uco73The more frames, the higher the consumption. I've been turning on V-sync for a long time in every game. When the frame rate is over 200 or 300 or more, the graphics card's consumption is over 300 W. When V-sync is on, consumption is below 100 W. And what can you say to all that? Is there really a need for the frame rate to be over 60? How many frames per second is the human eye capable of processing?
You should cap your FPS at whatever you feel comfortable with. There's a lot of variation from person to person in regards to motion smoothness. Some people are fine with 60 FPS while others want 500+.
Posted on Reply
#67
yfn_ratchet
Icon CharlieHMMMMMM..... I find this MOST interesting on this sort of market speak... I wonder this has anything to do with the decline of stock prices OF LATE... SINCE NGREEDIA IS A A....I.... COMPANY NOW....
Yup I can see the GREENIES who drank the green coolaid going for the biggest and bestest and AWESOMER NEWER CARD, BASICALLY they fail fail on the basic needs of life. But they sure get their new shiny!


These are the people that give NGREEDIA the money to continue the mantra of....
THE MOAR YOU BUY.... THE MOAR YOU SAVE...SAVE...SAVE...SAVE...SAVE...
Ah, hello there No-Bark. Good to see you're still kicking.

In any case, I can concede that the newer cards are going to be more 'efficient' in terms of joules per unit of work, but re: the argument above, it shouldn't be the responsibility of the consumer to closely manage what is actually relevant to their living expenses and their own indoor comfort: raw power draw/heat output. The further GPU manufacturers push the power ceiling on their products in this slow, creeping fashion, the worse it gets. That's still a bad thing.

In an ideal world a new arch + new node would mean a significant (if merely 'generational') bump in performance on the same power budget at stock card-for-card, not a HUGE jump in performance for a slightly less huge jump in power draw. And that would be the selling point. This isn't datacenter, with industrial chillers and 240V/100A wall outlets. This is a small indoor room on a tiny sliver of a 3-ton unit's capacity and a 120V outlet that pops if you edge above 15A total per room. The ceiling is far lower.

I remember when people would make fun of the GTX 480 and call it a space heater/George Foreman. That thing drew 250 W max. The 4070 Super is 30 W below that on tech a decade newer.

I'm very much for UV, but that's because I find it interesting how low you can push the silicon before it starts to drag its feet and because I live in a place that is very hot for half the year. I shouldn't be hearing complaints from my Michigander friend about how his 5080 makes him sweat in March. I shouldn't be telling him 'kick rocks, make your computer slower'.
Posted on Reply
#68
Dr. Dro
the54thvoidMy 2080ti had way more class than your 4090. :D

(I'm sure you also had a 2080ti).
Pfft it was us RTX 3090 owners who had all the class :laugh:

Posted on Reply
#69
Minus Infinity
BwazeRTX 4080 is $1200. Even without the AI craze, Jensen told us there is no Moore's Law any more; any increase in performance will bring an increase in price.

You can take this as a rough guide:

$1200 + 50% = $1800, and that doesn't even cover the inflation!
So you think 5090 will be $7200?
Posted on Reply
#70
Minus Infinity
RogueSixExactly. Most of us will rarely (if ever) even get close to peak power consumption. Me, for example, I have a 43" ASUS ROG monitor with 4K/144Hz resolution and I'm always limiting my fps in games to 120fps whenever possible. Otherwise, I have V-Sync enabled globally which is nVidia's recommended setting for G-Sync (compatible) displays, i.e. I will always stay within the sync range of 144Hz/fps max.

The RTX 5090 will be able to reach those 120fps or 144fps even easier than the RTX 4090 so it will be a beast in terms of performance and efficiency. The theoretical 600W peak power consumption which is already present on some RTX 4090 custom designs (not mine... MSI Suprim X has a 520W limit) is of very little practical relevance.
Well yeah, lock it at 120 fps and it will almost certainly use less power than the 4090. But a lot of people will just let it rip, and at 200 fps it'll be power hungry.

So would people actually buy a 5090 that got similar performance to the 4090 but at, say, half the power? No; IMO they'll want to see headlines reading 40% stronger and ignore power consumption. AMD is getting hammered for basically doing this with Zen 5.
Posted on Reply
#71
AusWolf
the54thvoidMy 2080ti had way more class than your 4090. :D

(I'm sure you also had a 2080ti).
Class in GPUs died together with the Titan name, imo. Even an x90 GeForce card is just a consumer product like any other.
Posted on Reply
#72
TheinsanegamerN
I'm quite happy with my 6800xt, but I've wandered into the possibility of going from 2k144 to 4k144, and boy does 5090 sound like fun. I could do watercooling again too.

I've been full AMD since polaris but I may have to switch back.
Posted on Reply
#73
dgianstefani
TPU Proofreader
TheinsanegamerNI'm quite happy with my 6800xt, but I've wandered into the possibility of going from 2k144 to 4k144, and boy does 5090 sound like fun. I could do watercooling again too.

I've been full AMD since polaris but I may have to switch back.
Having DLAA/decent RT is nice too, I assume you care about IQ since you are moving to 4K.
Posted on Reply
#74
TheinsanegamerN
dgianstefaniHaving DLAA/decent RT is nice too, I assume you care about IQ since you are moving to 4K.
Perhaps in the future. Right now nothing I play uses either. Thing is, even modded Deep Rock Galactic or just Civ VI maxed out seems to tax the 6800 XT a bit too hard; I can't maintain 144 now, and moving to 4K will only make it worse.

If I get into the new Warhammer or something similar, I feel it's going to become more evident.
Posted on Reply
#75
dgianstefani
TPU Proofreader
TheinsanegamerNPerhaps in the future. Right now nothing I play uses either. Thing is, even modded Deep Rock Galactic or just Civ VI maxed out seems to tax the 6800 XT a bit too hard; I can't maintain 144 now, and moving to 4K will only make it worse.

If I get into the new Warhammer or something similar, I feel it's going to become more evident.
DRG has DLAA.
Posted on Reply