Tuesday, September 24th 2024

NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is dead, and that future generations of logic hardware will only get larger, hotter, and hungrier for power. NVIDIA's next-generation "Blackwell" graphics architecture promises certain architecture-level performance-per-Watt improvements, coupled with node-level performance-per-Watt gains from the switch to the TSMC 4NP (4 nm-class) node. Even so, the GeForce RTX 5090, the part that succeeds the current RTX 4090, will be a power-hungry GPU, with rumors suggesting the need for two 16-pin power inputs.

TweakTown reports that the RTX 5090 could come with two 16-pin power connectors, which would give the card the theoretical ability to pull 1200 W continuous. This doesn't mean the GPU's total graphics power (TGP) is 1200 W, but rather a number close to or greater than 600 W, which calls for two of these connectors. Even if the TGP is exactly 600 W, NVIDIA would want to deploy two inputs to spread the load between the connectors and improve their physical resilience. Both connectors will likely have 600 W input capability, so end-users can't mix them up, as they could if one were rated for 600 W and the other keyed to 150 W or 300 W.
Above is a quick Photoshop mock-up by TweakTown of what such a card could look like. The requirement of two 16-pin connectors should rule out older PSU types, and NVIDIA will likely include only one adapter that converts two or three 8-pin PCIe power connectors to a 16-pin, with the other input expected to be a native 600 W input from an ATX 3.0 or ATX 3.1 PSU. Most newer-generation ATX 3.0 and ATX 3.1 PSUs on the market have only one native 16-pin connector, plus three or four additional 8-pin PCIe power connectors. As for the connector itself, it will very likely be a 12V-2x6 with compatibility for 12VHPWR.
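For a sense of why spreading the load matters, here is some quick back-of-the-envelope math, assuming the 12V-2x6's six +12 V power pins and the 9.5 A per-pin rating commonly cited for the connector; the card's real TGP is still a rumor:

```python
# Back-of-the-envelope per-pin math for a 600 W card on one vs. two
# 12V-2x6 connectors. The 6 power pins and 9.5 A/pin rating come from
# the published connector spec; the card's actual TGP is unconfirmed.

VOLTAGE = 12.0      # V, supply rail
POWER_PINS = 6      # +12 V current-carrying pins per 12V-2x6
PIN_RATING = 9.5    # A, rated current per pin

def per_pin_current(board_power_w: float, connectors: int) -> float:
    """Current through each +12 V pin, assuming an even load split."""
    return board_power_w / VOLTAGE / (connectors * POWER_PINS)

for connectors in (1, 2):
    amps = per_pin_current(600, connectors)
    print(f"600 W over {connectors} connector(s): {amps:.2f} A/pin "
          f"({amps / PIN_RATING:.0%} of the 9.5 A rating)")
# 600 W over 1 connector(s): 8.33 A/pin (88% of the 9.5 A rating)
# 600 W over 2 connector(s): 4.17 A/pin (44% of the 9.5 A rating)
```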

Some PSU manufacturers are beginning to release high-wattage models with two native 12V-2x6 connectors, typically rated over 1300 W. The Seasonic Prime PX-2200 (2200 W), released earlier this week, is an extreme example of this trend: besides its high wattage, this PSU offers as many as four 12V-2x6 connectors. Another recent example is the MSI MEG AI1600T PCIE5 (1600 W), with two native 600 W 12V-2x6 connectors.
Source: TweakTown

110 Comments on NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

#26
arni-gx
"Some PSU manufacturers are beginning to release high-Wattage models with two native 12V-2x6 connectors. These would typically have a Wattage of over 1300 W. The Seasonic Prime PX-2200 W, released earlier this week, is an extreme example of this trend. Besides its high Wattage, this PSU puts out as many as four 12V-2x6 connectors. Another recent example would be the MSI MEG AI1600T PCIE5 (1600 W), with two native 600 W 12V-2x6."

If using a 1300-2200 W PSU just for an i9 14th/15th-gen or R9 9000/10000-series CPU with an RTX 5090 48 GB, what wattage and type of UPS should a PC gamer use to handle all the power from that PSU?
Posted on Reply
#27
ErikG
Real RTX GPU connector.

Posted on Reply
#28
DBGT
Actually, there is a water-cooled version of the 4090 that has two 16-pin connectors, if I am not mistaken.
Posted on Reply
#29
usiname
pk67The reason is obvious - a more power-hungry GPU fits well with more power supply phases, and that fits well with a higher input voltage.
Look how many phases we have on decent mobos these days.
The same trend will come to power-hungry add-in boards like GPUs or discrete NPU accelerators sooner or later, imho.

Demanding and aware users will prefer a silent, well-designed power section over one that's a couple bucks cheaper but noisy and less reliable.
Especially when they have to spend a grand or more to get the card.
The GPUs already have as many power phases as the motherboard, and no, 24 V won't make anything less noisy, because you will need to convert the power anyway. The GPUs already have conversion from 12 V; switching to 24 V will make it much bigger and more expensive, so you don't gain anything.
Are you so spoiled that you want a single power connector on the GPU? Where were you the past 15 years, when almost every x60-class card was equipped with two 8-pin connectors and high-end cards sometimes came with 3x 8-pin, sometimes even 4x 8-pin? The 16-pin connector is smaller than a single 8-pin power connector, so what is the problem?
Posted on Reply
#30
mtosev
The power requirements are crazy. I hope the card turns out to be a monster when it comes to performance.
Posted on Reply
#31
pk67
JWNoctisArguably, bumping up the voltage worked for the HVDC folks, and for the automotive industry, which tried to go straight to 48 V for a lot of the same reasons.

The way things are going, it might be necessary soon enough.
I see I'm not alone here at least.
Posted on Reply
#32
chrcoluk
Didn't Nvidia say the reason for these was to avoid multiple connectors? They need to get the power draw in check.

2030

"Nvidia have just announced their upcoming 7090 GPU, comes with 64pin connector to mean no more multiple cables, and even comes with its own 3000W PSU to ensure it has the power it needs. All for the great price $4000, The cooler has also been beefed to a 5 slot solution".

PSU vendors love Nvidia right now.
Posted on Reply
#34
bug
They can come with a dedicated power plant for all I care. What I'm looking for is a decent mid-range. Haven't seen that in years.
Posted on Reply
#35
usiname
AVATARAT
Actually it will make the GPUs much safer, because even if one connector fails, the other will take the load. The reason for burning connectors is bad contact, which leads all the power to go through a few pins, and they heat up.
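A rough illustration of that failure mode, with hypothetical contact-resistance numbers:

```python
# Hypothetical numbers: the same 600 W forced through fewer and fewer
# good pins as contacts fail, and the resulting heat per contact.

POWER_W = 600.0
VOLTAGE = 12.0
CONTACT_RES_OHM = 0.005  # assumed ~5 milliohm per mated pin contact

for good_pins in (6, 4, 2):
    amps = POWER_W / VOLTAGE / good_pins
    heat_w = amps ** 2 * CONTACT_RES_OHM  # P = I^2 * R per contact
    print(f"{good_pins} good pins: {amps:.1f} A/pin, "
          f"~{heat_w:.2f} W of heat per contact")
# 6 good pins: 8.3 A/pin, ~0.35 W of heat per contact
# 4 good pins: 12.5 A/pin, ~0.78 W of heat per contact
# 2 good pins: 25.0 A/pin, ~3.12 W of heat per contact
```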
Posted on Reply
#36
pk67
usinameThe GPUs already have as many power phases as the motherboard, and no, 24 V won't make anything less noisy, because you will need to convert the power anyway. The GPUs already have conversion from 12 V; switching to 24 V will make it much bigger and more expensive, so you don't gain anything.
Are you so spoiled that you want a single power connector on the GPU? Where were you the past 15 years, when almost every x60-class card was equipped with two 8-pin connectors and high-end cards sometimes came with 3x 8-pin, sometimes even 4x 8-pin? The 16-pin connector is smaller than a single 8-pin power connector, so what is the problem?
24 V alone won't make anything less noisy - I agree - but coupled with a good design culture it can lead to a silent, reliable and efficient power section.

If you still can't imagine why 12 V doesn't help achieve these goals, think about what would happen if you tried to supply GPU cards with 5 V like in the vintage days.
Posted on Reply
#37
AusWolf
Who could have seen it coming? :rolleyes: /s

Posted on Reply
#38
usiname
pk67Alone 24 V won't make anything less noisy - I agree - but coupled with a good design culture it can lead to a silent, reliable and efficient power section.

If you still can't imagine why 12 V doesn't help achieve these goals, think about what would happen if you tried to supply GPU cards with 5 V like in the vintage days.
You have the SAME power conversion no matter if it is in the GPU or in the PSU, and if you hear noises from your GPU now, what will happen when you have even more power phases and heavier electrical components? I will tell you: an even noisier GPU. If they are using trash components now, they won't use better ones with 24 V. I trust the PSU manufacturers much more, and I would rather spend a few more bucks on a better, quieter power supply than spend more money on a top halo GPU to get better power delivery.
By the way, I have never heard noise from the electronics of a GPU; even if there is any, when the GPU is loaded the fans are at high RPM and mask it. What trashy GPU do you have to hear such noises?

Edit: One more thing: currently PSUs output 3.3, 5 and 12 V. Do you want to know what will happen when you add a 24 V output to PSUs? They will become more expensive and complex than now, and, your favorite, noisier.
Posted on Reply
#39
Pumper
Nvidia invents a new connector to eliminate the need for multiple power cables, then plans to release a GPU that needs multiple power cables.
Posted on Reply
#40
AVATARAT
usinameActually it will make the GPUs much safer, because even if one connector fails, the other will take the load. The reason for burning connectors is bad contact, which leads all the power to go through a few pins, and they heat up.
It's just a meme.
Posted on Reply
#42
Prime2515102
Soon people will have to get a dedicated 20A/2400W circuit installed in their home just for their computer (15A/1800W is standard for residential in the US).

I guess if you can afford these top-end cards it wouldn't be a problem to get done though.
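For reference, the circuit math at 120 V, including the ~80% continuous-load derating that US electrical code commonly applies (a sketch, not electrical advice):

```python
# US residential circuit capacity at 120 V, with the ~80% derating
# typically applied to continuous loads.

MAINS_V = 120.0

for breaker_amps in (15, 20):
    peak_w = MAINS_V * breaker_amps
    continuous_w = 0.8 * peak_w
    print(f"{breaker_amps} A circuit: {peak_w:.0f} W peak, "
          f"~{continuous_w:.0f} W continuous")
# 15 A circuit: 1800 W peak, ~1440 W continuous
# 20 A circuit: 2400 W peak, ~1920 W continuous
```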
Posted on Reply
#43
Solid State Brain
Prime2515102Soon people will have to get a dedicated 20A/2400W circuit installed in their home just for their computer (15A/1800W is standard for residential in the US).

I guess if you can afford these top-end cards it wouldn't be a problem to get done though.
I already use my RTX 3090 with a 250 W limit for several reasons, one of them being that my uninterruptible power supply will begin beeping due to overload above 500 W. Performance is only slightly affected by decreasing it from the default 370 W. Der8auer also did some related testing with an RTX 4090.

I expect the same to hold for this supposedly 600+ W RTX 5090. These high-end GPUs don't really need to use that much power for great performance; it's primarily commercial factors that dictate that they do.
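A minimal sketch of applying such a cap with NVIDIA's nvidia-smi tool (requires admin/root; assumes GPU index 0 and that the requested value is within the card's supported power-limit range):

```python
# Cap the GPU's board power via nvidia-smi's -pl switch.
import subprocess

def set_gpu_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Set a power cap on the given GPU using nvidia-smi."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_gpu_power_limit(250)  # mirrors the 250 W limit mentioned above
```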
Posted on Reply
#44
Dammeron
chrcoluk2030

"Nvidia has just announced its upcoming 7090 GPU. It comes with a 64-pin connector so there are no more multiple cables, and even ships with its own 3000 W PSU to ensure it has the power it needs."
"And the PSU cable is hard-mounted to the card to remove possible user errors!"
Posted on Reply
#45
pk67
usinameYou have the SAME power conversion no matter if it is in the GPU or in the PSU, and if you hear noises from your GPU now, what will happen when you have even more power phases and heavier electrical components? I will tell you: an even noisier GPU. If they are using trash components now, they won't use better ones with 24 V. I trust the PSU manufacturers much more, and I would rather spend a few more bucks on a better, quieter power supply than spend more money on a top halo GPU to get better power delivery.
By the way, I have never heard noise from the electronics of a GPU; even if there is any, when the GPU is loaded the fans are at high RPM and mask it. What trashy GPU do you have to hear such noises?
Now I see you have no idea what you're talking about.
It is not a matter of trust - it is a matter of design constraints.

Do you have the tiniest idea why a PMIC is mounted on each DDR5 module instead of in the power supply or on the mobo itself?

In short, the 12 V solution is archaic* these days and still exists as a matter of industry inertia.

A higher input supply voltage will force designers to spread a higher power budget across more physically installed phases because of the higher conversion ratio.
Now we have 12 V --> ~1 V, so a 12:1 ratio. With a 24 V supply we would have a 24:1 ratio instead.
For design culture it is a game changer.
But some folks like you are unable to get it, I see.

edit
*Archaic for 400 W+ power budgets. For GPU power budgets of, e.g., 80-150 W, the 12 V supply level is still fine and will be forever.
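The input-current side of that argument is easy to illustrate; a quick sketch (the cable-path resistance here is an assumed round number, not a measured figure):

```python
# The same 600 W board fed at 12 V vs. 24 V. Halving the current
# quarters the I^2*R loss; the 10 milliohm cable-path resistance is
# an assumption for illustration only.

BOARD_POWER_W = 600.0
CABLE_RES_OHM = 0.010  # assumed round-trip cable + connector resistance

for supply_v in (12.0, 24.0):
    amps = BOARD_POWER_W / supply_v
    loss_w = amps ** 2 * CABLE_RES_OHM
    print(f"{supply_v:.0f} V input: {amps:.0f} A, "
          f"~{loss_w:.1f} W lost in the cable")
# 12 V input: 50 A, ~25.0 W lost in the cable
# 24 V input: 25 A, ~6.2 W lost in the cable
```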
Posted on Reply
#46
PLAfiller
I was not disappointed by the jokes in this thread :D :D

Nvidia can make efficient cards. Their RTX A2000 was phenomenal, and now the RTX A4000 SFF is a little marvel, like a 3060 Ti @ 70 W, fantastic efficiency.

Clearly they know how to create these gems.

I guess it's just not worth the effort, plus there's the premium feeling of gaming on a huge gas-guzzler you can look at through the glass and feel better about the money spent :P
Posted on Reply
#47
usiname
pk67Now I see you have no idea what you're talking about.
It is not a matter of trust - it is a matter of design constraints.

Do you have the tiniest idea why a PMIC is mounted on each DDR5 module instead of in the power supply or on the mobo itself?

In short, the 12 V solution is archaic these days and still exists as a matter of industry inertia.

A higher input supply voltage will force designers to spread a higher power budget across more physically installed phases because of the higher conversion ratio.
Now we have 12 V --> ~1 V, so a 12:1 ratio. With a 24 V supply we would have a 24:1 ratio instead.
For design culture it is a game changer.
But some folks like you are unable to get it, I see.
Strange to hear this from someone so clueless he can't answer basic questions, wants GPUs to be more expensive because he "hears" electrical noises from his cheap GPU, and wants a single power connector. The problem is not the GPUs and 12 V; the problem is you.
Posted on Reply
#48
Bwaze
Solid State BrainI already use my RTX 3090 with a 250 W limit for several reasons, one of them being that my uninterruptible power supply will begin beeping due to overload above 500 W. Performance is only slightly affected by decreasing it from the default 370 W. Der8auer also did some related testing with an RTX 4090.

I expect the same to hold for this supposedly 600+ W RTX 5090. These high-end GPUs don't really need to use that much power for great performance; it's primarily commercial factors that dictate that they do.
You do know there is not that much performance difference between the RTX 4080 Super and the RTX 4090 to just throw away?

In the latest reviews it's:

14.6% at 1080p (192 vs 230 FPS)
22% at 1440p (148 vs 180.4)
27.8% at 4K (89.2 vs 114)

By "going green" you might be throwing away half the reason you spent 1750 EUR instead of 1000 EUR?
Posted on Reply
#49
sephiroth117
We get 75 W from the PCIe slot, even more with newer motherboards (with that extra PCIe PSU input).
We get 600 W from the 12V-2x6.

How in hell are they going to use more than that in a consumer-grade GPU?

If you are pumping that many watts, it's the Intel way: it means efficiency innovation isn't fast enough, so you pump the wattage up to mark a clear improvement over Ada Lovelace.

In that case, IMHO, there's no point gaming on Blackwell, and I'd wait for the RTX 6000 series. I'm tired of companies shipping 300 W CPUs and up to 1200 W GPUs because they want to use cheaper TSMC nodes that are more power hungry.

I sure hope it's either a Titan, a 5090 Ti or some OC AIB cards... I'm sticking with one 12V-2x6 for my upgrade (I really need to replace that RTX 2060).
Posted on Reply
#50
Evrsr
Well, well, some sanity on this one. If they are going to be pushing this much power, it's better to split it across two connectors.

The big issue here will also be cable bending. Much like the 30-series never failed because its connector was angled, if this one requires bending it will not be fault-free, but it will be a step in the right direction.

Also, given the lack of card support we've seen on newer coolers and even backplates, coupled with this issue, I would recommend people only use these vertically mounted. Everything else is exceedingly problematic. They will keep breaking at the PCIe slot (or breaking the slot) and warranty will be denied, despite these being pretty much massive design flaws in cooler weight and its proper support.
Posted on Reply