Friday, March 25th 2022

NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

After the launch of the data center-oriented Hopper architecture, NVIDIA is slowly preparing to transition the consumer segment to new, gaming-focused designs codenamed Ada Lovelace. Thanks to Igor's Lab, we have some additional information about the upcoming lineup and a sneak peek at a few features of the top-end GeForce RTX 4080 and RTX 4090 SKUs. According to Igor, NVIDIA is using the upcoming GeForce RTX 3090 Ti as a test run for the next-generation Ada Lovelace AD102 GPU: the company is testing the PCIe Gen5 power connector and wants to see how it fares with the biggest GA102 SKU before Ada arrives.

Additionally, we learn that the AD102 GPU is supposed to be pin-compatible with GA102, meaning the pin layout of AD102 matches that of GA102. There are 12 places for memory modules on the AD102 reference design board, resulting in up to 24 GB of GDDR6X memory. As many as 24 voltage converters surround the GPU; NVIDIA will likely implement the uP9512 controller, which can drive eight phases, resulting in three voltage converters per phase and ensuring proper power delivery. The total board power (TBP) is likely rated at up to 600 Watts, meaning that the GPU, memory, and power delivery circuitry combined can dissipate up to 600 Watts of heat. Igor notes that board partners will bundle 12+4-pin (12VHPWR) to four 8-pin (legacy PCIe) adapters to ensure PSU compatibility.
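Translating the reported board figures into numbers — a quick sketch, where the 2 GB-per-module density is our assumption based on the densest GDDR6X modules shipping at the time, not something the report states:

```python
# Back-of-envelope check of the AD102 board figures reported by Igor's Lab.
# Assumption: 2 GB per GDDR6X module (not confirmed by the report).
memory_slots = 12
gb_per_module = 2
total_vram_gb = memory_slots * gb_per_module  # 12 * 2 = 24 GB

# 24 voltage converters fed by an 8-phase controller (reportedly the uP9512)
# means the converters are tripled up on each phase.
converters = 24
phases = 8
converters_per_phase = converters // phases  # 24 / 8 = 3

print(total_vram_gb, converters_per_phase)  # 24 3
```

Tripling converters per phase spreads current (and therefore heat) across more components, which is one way to feed a board rated for up to 600 W.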
Source: Igor's Lab

107 Comments on NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

#26
Unregistered
What's the problem? Intel ADL is only 240 W, so this is only a tad more. /s
#27
mechtech
hmmmmm this or new car

tough call
#28
Tartaros
*Flashbacks of GTX400 and GTX500 times*
#29
oobymach
600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
#30
TechLurker
I wonder if we'll start seeing auxiliary GPU PSUs that fit into a 120mm x 120mm or 140mm x 140mm fan mount and are as thick as a 45mm-60mm radiator, in the same vein as the former 5.25" bay auxiliary PSUs from VisionTek, FSP, and Thermaltake (off the top of my head) that were used to power the GPUs of the time because consumer PSUs couldn't keep up.



Or if dual-PSU cases become a thing again, because dual-systems is so 2018 and it's cheaper to buy 2 800w Bronze PSUs than 1 1600w Gold PSU. Assuming it's dual-PSU cases again, Phanteks will need an updated version of their PSU Combo adapter with the new 12+4 pin GPU connector instead of the older 8-pins. Or go back to using slaved PSU adapters for delay-turning on the second PSU. At least we now have the smaller SFX PSUs that could be squeezed into cases next to the big ATX PSUs.
#31
thegnome
oobymach: 600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
1000w+ should do... that is, if we don't see power spikes on the newer cards, which are hopefully gone, but if they're still there you'd probably need at least 1200w or more...

Oh, and don't forget that's only for the top-end SKUs; the 4070 should be two 8-pin and the 4080 three 8-pin, and if you consider undervolting (which is probably much more common on those cards) you probably wouldn't need a PSU upgrade.

I do wonder, though, how the newer PSUs will look. Say 1000w - will it keep having 4/5 8-pin connectors, or just one or two with a 450/600w-capable 16-pin?
#32
MentalAcetylide
RedelZaVedno: 600W? Thanks, but no thanks. Summers here are hot AF, the last thing I need is an additional room heater. Just imagine the electricity bill: 800 watts for the gaming PC, 300 watts for the air conditioner... that's over 1 kW of draw in order to game. Jensen has totally lost it.
If I owned hell and a room with a system that is running a top end 4000-series card, I would rent the room out and live in hell. :laugh: -Chronicles of NVidia
#33
wolf
Performance Enthusiast
defaultluser: I'd be surprised if this thing tops 450w stock!
Same, 600w makes for great shock-and-awe headlines, but it's very unlikely to be what stock GPUs call for this gen.
Aquinus: people said the R9 390 and Vega 64 were power hungry
They certainly were.
Assimilator: Lovelace won't be anywhere near 600W, except the craziest overclocking SKUs. I will repeat: just because the 12VHPWR allows up to 600W power draw does not in any way, shape or form mean that Lovelace will draw that.
Very much this. The new power delivery spec allows 600w; I would massively doubt even the 4090 rolls out needing anywhere close to 600w.
#34
DoLlyBirD
I remember all the "modern PCs are much more efficient than older ones", "use less voltage", "improved efficiency per watt" talk, etc etc. Meanwhile back in 2005, with 2.4v/3v DDR, 2v+ CPUs and "inefficient" GPUs that required.... wait for it.... external PCIe 6-pin/4-pin Molex POWA :eek:, you could run a top-of-the-range gaming PC on a 300w PSU. But..... 2022, save the planet, buy a platinum 1kw PSU to run your 450w GPU and 250w CPU.... PROGRESS :rolleyes:
#35
Keullo-e
S.T.A.R.S.
chrcoluk: Does America have the same energy crisis as the UK? A guy mining stopped when he had a £500/month electric bill ($660).

If I ran the GPU in the UK without undervolt and no FPS cap, I think I would pay more in electricity than buying the card in the first year. :)
Yeah, in some cases electricity prices have been rising here in Finland too. Though in my case, the provider actually informed me that my pricing will stay the same as before.
DoLlyBirD: I remember all the "modern PCs are much more efficient than older ones", "use less voltage", "improved efficiency per watt" talk, etc etc. Meanwhile back in 2005, with 2.4v/3v DDR, 2v+ CPUs and "inefficient" GPUs that required.... wait for it.... external PCIe 6-pin/4-pin Molex POWA :eek:, you could run a top-of-the-range gaming PC on a 300w PSU. But..... 2022, save the planet, buy a platinum 1kw PSU to run your 450w GPU and 250w CPU.... PROGRESS :rolleyes:
Yeah, good point! I remember when efficiency was almost the top thing back then. Especially when Core 2 Duo came, it indeed was hella efficient over the previous P4/PD CPUs.
#36
DoLlyBirD
MaenadFIN: Yeah, in some cases electricity prices have been rising here in Finland too. Though in my case, the provider actually informed me that my pricing will stay the same as before.

Yeah, good point! I remember when efficiency was almost the top thing back then. Especially when Core 2 Duo came, it indeed was hella efficient over the previous P4/PD CPUs.
Double performance at the same or less power draw. Now we have 250w CPUs and 450w GPUs, have we really moved on? 95w blazing hot high-clocked P4s lol
#38
Keullo-e
S.T.A.R.S.
DoLlyBirD: Double performance at the same or less power draw. Now we have 250w CPUs and 450w GPUs, have we really moved on? 95w blazing hot high-clocked P4s lol
The funniest thing is that back then the hottest P4s felt like a stove and everyone joked about them; modern Intel chips draw way more power, but it's not as bad as in the P4 days.
#39
Mussels
Freshwater Moderator
I wonder how long before governments kick in and place limits on stupidly high-wattage PC parts - I know at least one place did.
#40
Sabotaged_Enigma
Next time I'll turn it into something I can cook my meals on.
#41
nguyen
Latest rumors say the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
#42
Solid State Soul ( SSS )
Vayra86: 102 at 600W!? Wow. So they doubled their TDP overnight (-gen) even counting a shrink. Nice. Is this Nvidia doing a lil' AMD here? Moar moar moar because our core tech is really EOL with bandaids? Or is this the new reality... and will AMD follow suit. Either way, Nvidia is giving away a LOT of room to play for AMD to come up with something silly. I'm just trying to let it sink in here. Six. Hundred. Watts. I mean, two generations back we had the same SKU at 28
Nvidia prioritizes RT core and Tensor core performance improvements over CUDA core performance per watt; this is why, since Turing, Nvidia cards consume more and more power each gen. People want their rAyTrAcInG and dLsS.... well here you go, 600w GPUs son!
nguyen: Latest rumors say the 4090 can be 2-2.5x faster than the 3090 at 4K, that would be insane for one generation leap.
They say that, but they didn't clarify if it's RT core performance or CUDA core performance. They made similar claims with Ampere, only for reviews to find out it was ray tracing performance and not that substantial actual GPU performance.
DoLlyBirD: 2022 save the planet, buy a platinum 1kw PSU to run your 450w GPU and 250w CPU.... PROGRESS :rolleyes:
Nobody cares about the planet; corporations just say that to gain karma points, all while they release countless products year after year, like smartphones for example. Just look how many phones Samsung has released in 2020 and 2021 alone, all while they stamp "save the planet" on their packaging and remove chargers lol
#43
Richards
If this is less than 2x performance it's a flop... because AMD will take the performance crown with the 7900xt.
#45
nguyen
Solid State Soul ( SSS ): Nvidia prioritizes RT core and Tensor core performance improvements over CUDA core performance per watt; this is why, since Turing, Nvidia cards consume more and more power each gen. People want their rAyTrAcInG and dLsS.... well here you go, 600w GPUs son!

They say that, but they didn't clarify if it's RT core performance or CUDA core performance. They made similar claims with Ampere, only for reviews to find out it was ray tracing performance and not that substantial actual GPU performance.

Nobody cares about the planet; corporations just say that to gain karma points, all while they release countless products year after year, like smartphones for example. Just look how many phones Samsung has released in 2020 and 2021 alone, all while they stamp "save the planet" on their packaging and remove chargers lol
Just stick to a TDP figure you are comfortable with and let others enjoy their 500W GPU :D (not me, I would just undervolt it down to ~300W). I'm sure the AD104/106/107 chips will be ultra efficient and more suitable for your gaming needs.

Ada is just a die shrink of Ampere, so RT/Tensor perf would scale linearly with raster compared to Ampere.

There are AI supercomputers doing all the planet-saving stuff, so in the end capitalism saves itself from the problems it created :D. This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.
#46
Cutechri
nguyen: This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.
Of course, when you offload all the misery of production to either third-world countries or those 'non-capitalist' ones (that are still capitalist). That being said, please do not continue this conversation; there's no point in yet another thread going political.
#47
Why_Me
Cutechri: Of course, when you offload all the misery of production to either third-world countries or those 'non-capitalist' ones (that are still capitalist). That being said, please do not continue this conversation; there's no point in yet another thread going political.
You mean jobs?
#48
Flydommo
The upcoming RTX 4080/90 lineup is obviously directed at enthusiast 4K gamers. I would assume these cards will deliver a very significant performance leap over the previous generation high-end GPUs at the cost of a markedly higher TDP, and they will be very expensive of course. The bulk of the customers however will buy mid-range cards, and so will I provided they significantly outperform my current GTX 1080Ti FE, have a good cooling solution with a two-slot design, and a TDP of around 300 Watts. I'll have my eyes on the RTX 4070 series and my hope is some custom designs (or maybe an FE type of card) will be brought to market that fit the bill.
#49
thelawnet
chrcoluk: Does America have the same energy crisis as the UK? A guy mining stopped when he had a £500/month electric bill ($660).

If I ran the GPU in the UK without undervolt and no FPS cap, I think I would pay more in electricity than buying the card in the first year. :)
That's just dumb miners being dumb.

The price of electricity is literally in every mining calculator.

As far as gamers go, apparently UK electricity is up to 28p/kWh, which is insane, but even so, at 600W that's about £20/month gaming 4 hours a day, on a £3000 (?) GPU.

Given that people pay some stupid amount of money (north of £100/month) to watch Sky Sports, even when you deliberately buy the 'massively overpriced halo edition' GPU the electricity, even in a post-Putin inflation hellscape, is not that expensive.
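thelawnet's £20/month figure checks out as a back-of-envelope calculation, taking the post's own assumptions (28p/kWh, 4 hours of gaming a day, full 600 W draw):

```python
# Monthly electricity cost of a 600 W GPU at UK prices (assumptions from the post).
watts = 600
hours_per_day = 4
days_per_month = 30
price_per_kwh = 0.28  # £0.28, i.e. 28p/kWh

kwh_per_month = watts / 1000 * hours_per_day * days_per_month  # 72 kWh
cost = kwh_per_month * price_per_kwh                           # £20.16

print(f"£{cost:.2f} per month")  # £20.16 per month
```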
#50
Vayra86
nguyen: Just stick to a TDP figure you are comfortable with and let others enjoy their 500W GPU :D (not me, I would just undervolt it down to ~300W). I'm sure the AD104/106/107 chips will be ultra efficient and more suitable for your gaming needs.

Ada is just a die shrink of Ampere, so RT/Tensor perf would scale linearly with raster compared to Ampere.

There are AI supercomputers doing all the planet-saving stuff, so in the end capitalism saves itself from the problems it created :D. This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.
Yes, because we export our industry to cheap-labor countries. China...

You are missing quite a bit of context to justify the cognitive dissonance. Another glaring one is the idea that processing power somehow helps the climate? All it really serves is understanding it, while slowly ruining it.

Also, the lower SKUs aren't all that efficient at all; the 104 also got its share of TDP bumps. It's a strange reading of facts you have here, buddy...

You make a great point when you say Ada is just shrunk Ampere; that does explain quite well why we get what we seem to be getting. Their tech isn't advancing a whole lot anymore, we are in a moar corezzz situation here like I alluded to earlier as well. We'll see where that goes... gonna be interesting what AMD will do with RDNA, but they then have the opportunity to gain feature parity with NV.