Friday, March 25th 2022

NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

After the launch of the data center-oriented Hopper architecture, NVIDIA is slowly preparing to transition the consumer segment to new, gaming-focused designs codenamed Ada Lovelace. Thanks to the folks over at Igor's Lab, we have some additional information about the upcoming lineup, including a sneak peek at a few features of the top-end GeForce RTX 4080 and RTX 4090 GPU SKUs. According to Igor's claims, NVIDIA is using the upcoming GeForce RTX 3090 Ti as a test run for the next-generation Ada Lovelace AD102 GPU: the company is testing the PCIe Gen5 power connector and wants to see how it fares with the biggest GA102 SKU, the GeForce RTX 3090 Ti.

Additionally, we learn that the AD102 GPU is supposed to be pin-compatible with GA102, meaning the pin layout on GA102 is the same as what we are going to see on AD102. There are 12 places for memory modules on the AD102 reference design board, resulting in up to 24 GB of GDDR6X memory. As many as 24 voltage converters surround the GPU, and NVIDIA will likely implement the uP9512 controller. It can drive eight phases, resulting in three voltage converters per phase and ensuring proper power delivery. The total board power (TBP) is reportedly rated at up to 600 Watts, meaning that the GPU, memory, and power delivery combined dissipate 600 Watts of heat. Igor notes that board partners will bundle adapters that convert four 8-pin (legacy PCIe) connectors to the 12+4-pin 12VHPWR connector to ensure PSU compatibility.
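As a rough sanity check on those figures, here is a minimal back-of-the-envelope sketch. The 2 GB-per-module capacity and the 150 W rating of a legacy 8-pin PCIe connector are assumptions used for illustration, not numbers from Igor's report.

```python
# Back-of-the-envelope check of the rumored AD102 board figures.
# Assumptions (not from the source): 2 GB per GDDR6X module,
# 150 W per legacy 8-pin PCIe connector.

MEMORY_MODULES = 12        # memory module placements on the reference board
GB_PER_MODULE = 2          # assumed GDDR6X module capacity
total_memory_gb = MEMORY_MODULES * GB_PER_MODULE      # -> 24 GB

VOLTAGE_CONVERTERS = 24    # power stages surrounding the GPU
PHASES = 8                 # phases the uP9512 controller can drive
converters_per_phase = VOLTAGE_CONVERTERS // PHASES   # -> 3 per phase

TBP_WATTS = 600            # rumored total board power
EIGHT_PIN_WATTS = 150      # assumed rating of one 8-pin PCIe connector
eight_pin_needed = TBP_WATTS // EIGHT_PIN_WATTS       # -> 4, matching the bundled adapter

print(f"{total_memory_gb} GB GDDR6X, {converters_per_phase} converters per phase, "
      f"{eight_pin_needed}x 8-pin for {TBP_WATTS} W")
```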
Source: Igor's Lab

107 Comments on NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

#76
oobymach
GamerGuyBah, I have an Enermax MAX REVO 1500W PSU that should be able to handle this, though I ain't looking at any RTX 4000 series cards, I'd be looking at an RX 7000 series card, will see the difference in performance and price between the RX 7800 XT and the RX 7900 XT.
1.5kw should be enough for a 4000 series, I have a 1kw with 4x 8pin so I could run one but not looking forward to it.
Posted on Reply
#77
Tartaros
nguyenbut hey DLSS is bad according to some too.
Wat. Who. How.
Posted on Reply
#78
nguyen
TartarosWat. Who. How.
There are purists here who think Native is the only way to enjoy games.
Posted on Reply
#79
AusWolf
My 550 W Seasonic Platinum PSU still has 8 years of its warranty left. I'm not going to bin it just because nvidia can't keep TDPs in check. Low spec gaming awaits! :cool:
Posted on Reply
#80
Mescalamba
Given that the price of electricity is probably only going up, I don't see this 600W feature being good for users... or nVidia.
Posted on Reply
#81
oxrufiioxo
nguyenThere are purists here who think Native is the only way to enjoy games.
Yeah, good for them.... I turn on DLSS in pretty much every game that supports 2.0 or later..... On a 48 inch oled from a normal sitting distance it's impossible to tell it apart from native.... I don't mind it at 1440p either though. It's really only at 1080p where I would only use it if I had a potato 2060 trying to play Cyberpunk....

Not a huge fan of the high end getting close to 600W though; my 3080 Ti at 400W is already semi-obnoxious to keep the room it's in cool as it is. Guessing they will end up actually being 450-500W cards just like custom 3090s though.
Posted on Reply
#82
DR4G00N
I don't see why the power consumption is a big deal, I mean in the past people were running 4-way SLI/CFX with OC'ed 480/580's or 5870/5970's sucking down over 1 kW for the cards alone to get the most performance possible. Now that multi-GPU is sadly dead I don't see how this is any different.
If you're wanting a 4080/4090 to get the best performance possible, the cost to run it likely doesn't matter to you.
Posted on Reply
#83
Totally
MaenadFINYeah, in some cases there's been a rise in the pricing of electricity here in Finland too. Though in my case, the provider actually informed me that my pricing will be the same as before.


Yeah, good point! I remember when efficiency was almost the top thing back then. I remember especially when Core 2 Duo came, it indeed was hella efficient over the previous P4/PD CPUs.
Yeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after in time for Turing. People in this space really only cared about efficiency when it was Nvidia. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power hungry/inefficient that card is! Need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their head they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.
Posted on Reply
#84
Assimilator
TotallyYeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after in time for Turing. People in this space really only cared about efficiency when it was Nvidia. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power hungry/inefficient that card is! Need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their head they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.
Utter rubbish.
Posted on Reply
#85
SKD007
And here I am, setting the GPU limit to 75% and the frame limit to 58 FPS on my 3080 Ti, waiting to upgrade.

Hopefully AMD can do it with 400W so I can switch back.
Posted on Reply
#86
destruya
So in other words the point of the article is "buy 3090s now while they're the closest they've been to their original MSRP in nearly two years' time and will work with your current PSU."
Posted on Reply
#87
Lycanwolfen
Who remembers when Nvidia said they were going to make video cards with more graphics power while using less real power? That was the pitch 5 years ago, and their roadmap showed less and less power-hungry video cards with more graphics output. Strange how that all went to the trash can.
Posted on Reply
#88
GamerGuy
DR4G00NI don't see why the power consumption is a big deal, I mean in the past people were running 4-way SLI/CFX with OC'ed 480/580's or 5870/5970's sucking down over 1 kW for the cards alone to get the most performance possible. Now that multi-GPU is sadly dead I don't see how this is any different.
If you're wanting a 4080/4090 to get the best performance possible, the cost to run it likely doesn't matter to you.
Yes, but these were more the exception than the rule. I ran CF and SLI setups (the last being 2x GTX Titan and 2x Vega 64, in two different rigs which I was running at the same time) through the years, and have been using 1 kW PSUs since getting a SilverStone OP1000 about a decade or more back.
oobymach1.5kw should be enough for a 4000 series, I have a 1kw with 4x 8pin so I could run one but not looking forward to it.
Heck, it's possible all the PSUs I have are able to do it - Seasonic X-1250, Corsair HX1050, Corsair HX1000 (Platinum) and my spare Enermax MAX REVO 1500W; these have at least 4x 8-pin PCIe power outputs each, I think the MAX REVO has 6x PCIe (not sure though).
Posted on Reply
#89
lemoncarbonate
oobymach600watts = 4x 8pin connectors from a psu, so you probably need a psu upgrade just to run one.
That's one whole PSU right there. They should just bundle it with an external power adapter like a laptop's.
Posted on Reply
#90
SIGSEGV
HTCThe 4090 has already been tested:


Dunno about the 4080 ...
LMAO, love this guy. He's also already leaked the unknown RTX 4090Ti card.
damn, we really need a private nuclear silo right in front of our house.
Posted on Reply
#91
GamerGuy
SIGSEGVLMAO, love this guy. He's also already leaked the unknown RTX 4090Ti card.
damn, we really need a private nuclear silo right in front of our house.
That isn't the rumored RTX 4090 Ti, I'm sorry but that's totally wrong! I've heard rumors that leather man himself is gonna call it the RTX 4090 Nuclear Edition.
Posted on Reply
#92
Mussels
Freshwater Moderator
nguyenLatest rumor say 4090 can be 2-2.5x faster than 3090 at 4K, that would be insane for one generation leap.
Those rumours and early results are always about stupid scenarios, oh yes it's twice as fast - at 8K with RTX on and DLSS (2 FPS to 4 FPS)


This is also going to be something stupid like a 600W-capable power connector, not that the GPUs use it
Posted on Reply
#93
ratirt
600W. That's a nice toaster. I like toast, but not made of silicon. Double on the core and double on the power. Not so much of an improvement to me.
Btw, it is just a rumor. I'm gonna make popcorn, sit tight, and see how many dead bodies fall from the closet :)
Posted on Reply
#94
ZoneDymo
600 watt capable is not 600 watt usage, we all agree on that.
However, why would one need to make it 600 watt capable if not to at least kinda approach it?

It is getting too high and I said it before: it's barely technological progress if, yes, we can do more, but it also costs more energy to achieve.
Sure, the performance per watt might go up... but clearly not enough if the wattage has to scale up like this constantly
Posted on Reply
#95
Legacy-ZA
nguyenI'm using DLSS Swapper which does exactly what you want "swap between DLSS versions for the games that support them with ease", although it only works with Steam.
No hidden surprises? I see I need to install a certificate, and the program uses a silent updater; not sure how I feel about that. :)
Posted on Reply
#96
Vario
HTCThe 4090 has already been tested:


Dunno about the 4080 ...
What a great YouTube channel! I really like the retro Voodoo-inspired cards. He must put a lot of labor into these replicas.
Posted on Reply
#97
ratirt
ZoneDymo600 watt capable is not 600 watt usage, we all agree on that.
However, why would one need to make it 600 watt capable if not to at least kinda approach it?

It is getting too high and I said it before: it's barely technological progress if, yes, we can do more, but it also costs more energy to achieve.
Sure, the performance per watt might go up... but clearly not enough if the wattage has to scale up like this constantly
When I read the OP's article about the rumor, or whatever it is, it says maximum board power, not cable requirements.
Posted on Reply
#98
DeeJay1001
chrcolukDoes America have the same energy crisis as UK? A guy mining stopped when he had a £500 month electric bill ($660).

If I ran the GPU in the UK without undervolt and no FPS cap, I think I would pay more in electric than buying the card in the first year. :)
Electricity is cheap and plentiful here on the Eastern side of the US. I pay well under 10c per kWh. My usage is significantly higher than other homes in my development and I've never heard a peep from the supplier or line provider.
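For context, here is a rough running-cost sketch for a 600 W card; the daily gaming hours and the UK tariff are illustrative assumptions, while the US rate follows the roughly 10c/kWh figure mentioned above.

```python
# Rough yearly running cost of a card drawing 600 W under load.
# Assumptions: 4 hours of gaming per day; rates are illustrative only.

BOARD_POWER_KW = 0.6      # rumored 600 W total board power
HOURS_PER_DAY = 4         # assumed daily gaming time
DAYS_PER_YEAR = 365

kwh_per_year = BOARD_POWER_KW * HOURS_PER_DAY * DAYS_PER_YEAR   # ~876 kWh

US_RATE = 0.10            # $/kWh, roughly the figure mentioned above
UK_RATE = 0.28            # £/kWh, assumed 2022-era UK tariff

print(f"~{kwh_per_year:.0f} kWh/year -> about ${kwh_per_year * US_RATE:.0f} in the US, "
      f"about £{kwh_per_year * UK_RATE:.0f} in the UK")
```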
Posted on Reply
#99
Vayra86
TotallyYeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after in time for Turing. People in this space really only cared about efficiency when it was Nvidia. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power hungry/inefficient that card is! Need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their head they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.
Efficiency is and was always a thing during every single day when we still had new nodes ahead of us. Efficiency, at its core, determines how far you can push a node without deviating from whatever the norm in the market is, for numerous aspects of the product: price, performance, heat, power, etc. Efficiency is also IPC. It is die size. It is feature set. Etc.

Now that the gain from a node shrink is diminishing, suddenly efficiency is out the window and we ride the marketing wave to convince ourselves we're progressing, even though there was never a day and age where hardware could last longer than today, simply because we're down to baby steps at best.

The reality is, we're not progressing, we're regressing when GPUs need substantially more power to produce playable frames in gaming. What's underneath that is really not quite relevant. The big picture is that in a world where efficiency is key to survival (literally), we're buying into products that counter that idea. It won't last, it can't last, and the deeper we go down that rabbit hole, the bigger the hurdle to get out of it again. RT is diving straight into that hole. DLSS/FSR are clawing us back out a little bit, but the net result is still that we're in need of more power to drive content.

It's not progress. What is progress is the fact that GPUs keep going faster. Sure. Another bit of progress is that you can still get or create an efficient GPU. I can understand that notion too. But the commercial reality is none of those things, as @Assimilator correctly states, because quite simply commerce is still all about more bigger faster stronger, fuck all consequences as long as we keep feeding the top 3%. The question is how far your Lemming mind wants to go before diving off the cliff.

The bottom line and fundamental question here is: are you part of the 3%? If you're not, you're an idiot for feeding them further. Vote with your wallet, or die - or present a horrible future to your children.

And that principle stands in quite a few of our current commercial adventures. The responsibility to change is on us, no one else.
Posted on Reply
#100
HTC
VarioWhat a great youtube channel! I really like the retro Voodoo inspired cards. He must put a lot of labor into these replicas.
Agreed. Dude usually only makes one video per year ... but ALL OF THEM are great, and I recommend them to all.

As for the topic, nVidia should focus on cutting down the gap between RT on and off instead of focusing on more FPS: for that matter, so should AMD with their version of RT. Not that more FPS isn't a good thing: it's just that the gap is really that severe.
Posted on Reply