Monday, February 19th 2024

NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

NVIDIA is reportedly looking to change its power connector standard for the fourth successive time in a span of three years with its upcoming GeForce RTX 50-series "Blackwell" GPUs, Moore's Law is Dead reports. NVIDIA began its post 8-pin PCIe journey with the 12-pin Molex MicroFit connector for the GeForce RTX 3080 and RTX 3090 Founders Edition cards. The RTX 3090 Ti went on to standardize the 12VHPWR connector, which the company debuted across a wider section of its GeForce RTX 40-series "Ada" product stack (all SKUs with a TGP of over 200 W). In the face of rising complaints about the reliability of 12VHPWR, some partner RTX 40-series cards are beginning to implement the pin-compatible but sturdier 12V-2x6. The implementation of the 16-pin PCIe Gen 6 connector would be the fourth power connector change, if the rumors are true. A different source says that rival AMD has no plans to move away from the classic 8-pin PCIe power connectors.

Update 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with the same access to the PCIe CEM specification as NVIDIA, and they say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector were in the offing, the power supply industry would have some clue by now. It doesn't. Read more about this in the Hardware Busters article in the source link below.

Update Feb 20th: In an earlier version of the article, it was incorrectly reported that the "16-pin connector" is fundamentally different from the current 12V-2x6, with 16 pins dedicated to power delivery. We have since been corrected by Moore's Law is Dead: it is in fact the same 12V-2x6, but under an updated PCIe 6.0 CEM specification.
Sources: Moore's Law is Dead, Hardware Busters

106 Comments on NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

#76
wNotyarD
theoutoI guess nvidia got scared by Intel's 400W CPU, so they need to assert their dominance.
Sometimes they must do something known as a power move.
Posted on Reply
#77
Knight47
Vayra86That's.... pretty far out there. Source? Or just conjecture?
Just trust me bro. Jon and that other dude with the hard to spell name are just trying to shift the blame to nvidia with that sponsor nonsense.

Anyone can see from a mile away that they (PCI-SIG, AMD, Intel) jebaited nvidia with the 12VHPWR.
Posted on Reply
#78
theouto
Knight47Just trust me bro. Jon and that other dude with the hard to spell name are just trying to shift the blame to nvidia with that sponsor nonsense.

Anyone can see from a mile that they(PCI-SIG, AMD, Intel) jebaited nvidia with the 12vhpwr
It's one way to interpret it, or nvidia was the only one wanting a tiny footprint on the power connector, and rushed to 12vhpwr. I doubt it was a conspiracy.
(Note: I know nothing about the timeline regarding 12vhpwr)
Posted on Reply
#79
gurusmi
Dr. DroTheoretical maximums don't equate to real power consumption figures, it's well known the RX 7900 series can chug about as much power as a 4090. W1zz's even tested a graphene pad with the 7900 XTX at 475 W:
1. Right now I pay €0.3924 per kWh. If the price difference between the cards (NVIDIA vs. AMD) is about €350, you can calculate for yourself how large the power gap between the two cards would have to be just to offset that difference in purchase price.

2. NVIDIA drivers are also available and working on Linux; they are just closed source, unlike AMD's.

3. What any tester finds about anything doesn't sway me. Every tester will sell their soul if it brings them a benefit; as the job title says, they try to influence you into buying a certain product. In my time that was called marketing. Unlike them, I decide on pure facts, no matter what an influencer tries to push on me.

It remains a fact that this 12VHPWR connector is less reliable than the PCIe one. Right now the 12VHPWR plug is a piece of crap; that's the real and only reason they introduced the modification. You also face big problems when you need a small bend radius in the cable: it simply isn't possible. Within the next four or five years the standard will mature to the point where it is as reliable as the PCIe cable has been for years, but it isn't now. I'm also not some dumb early adopter, so they can come back to me after the connector has matured. As easy as that.
Posted on Reply
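An editorial aside on point 1 in the post above: the break-even arithmetic gurusmi describes can be sketched in a few lines. The €0.3924/kWh rate and the roughly €350 price gap are figures from the post; the hours of use are a purely illustrative assumption.

```python
# Break-even sketch: how large a power gap (in watts) between two cards is
# needed before electricity savings offset a given purchase-price difference.
# Figures from the post: 0.3924 EUR/kWh, ~350 EUR price gap.
# The usage assumption (hours of gaming) is illustrative, not from the post.

def break_even_power_gap_w(price_gap_eur, eur_per_kwh, hours_of_use):
    """Power gap in watts at which energy savings equal the price gap."""
    kwh_needed = price_gap_eur / eur_per_kwh   # kWh whose cost equals the price gap
    return kwh_needed * 1000 / hours_of_use    # spread that energy over the usage hours

# Example: 2 hours of gaming per day over 3 years (~2190 hours, assumed).
gap_w = break_even_power_gap_w(350, 0.3924, 2 * 365 * 3)
print(f"Required power gap: {gap_w:.0f} W")
```

At those assumptions the gap would have to be around 400 W, which illustrates gurusmi's point: no realistic efficiency difference between comparable cards closes a €350 price gap quickly.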
#80
kapone32
Dr. DroTheoretical maximums don't equate to real power consumption figures, it's well known the RX 7900 series can chug about as much power as a 4090. W1zz's even tested a graphene pad with the 7900 XTX at 475 W:

www.techpowerup.com/review/thermal-grizzly-kryosheet-amd-gpu/
That is simply wrong. You are talking about OC.
Knight47Just trust me bro. Jon and that other dude with the hard to spell name are just trying to shift the blame to nvidia with that sponsor nonsense.

Anyone can see from a mile that they(PCI-SIG, AMD, Intel) jebaited nvidia with the 12vhpwr
That has to be the most narrative-based argument I have seen. All of the PSU makers changed their PSUs to ATX 3.0 because of Intel and AMD? Give me a break.
Posted on Reply
#81
Hecate91
Dr. DroTheoretical maximums don't equate to real power consumption figures, it's well known the RX 7900 series can chug about as much power as a 4090. W1zz's even tested a graphene pad with the 7900 XTX at 475 W:
The ASRock Taichi 7900 XTX's average power consumption in gaming is 419 W; it only goes higher with an OC or during power spikes.
I find the whole power consumption argument interesting, as Nvidia users didn't seem to care with the RTX 3000 series. The difference between 350 and 400 W isn't going to show up on a power bill.
www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/37.html
Dr. DroUnlikely IMO and the only reason 7900 XTX didn't have them is that the hardware design was finalized by the point those began to rollout. Other cards just rode the wave primarily as a publicity stunt "hey look ours don't blow up"
The RTX 3090 and 3090 Ti have the 12-pin connector, so unless the 7900 XTX series was still being designed at the time, AMD chose not to use it for whatever reason.
There are other threads here showing some RTX 40-series PCBs having the pin pads for 8-pin connectors, although with how Nvidia forces its AIBs to design their cards, I don't doubt Nvidia forced AIBs to use the 12-pin connector.
Posted on Reply
#82
iameatingjam
gurusmiI will be taking an AMD GPU for my next rig. Just smiling about NVidia. How many tries will they need to get a solid new connector? AMD needs the same power and still uses the old reliable connector.
I don't blame you. I have a 4090 with one of the original 12VHPWR connectors (the one before the update) and... it does make me a little nervous, considering there's not nearly enough room in my case to keep the cable as straight as they say it should be. At least I undervolt and limit frames, so there's not too much power going through there at any one time.

Anyway, I posted on the nvidia subreddit a while ago to point out a few of the flaws with the new connector, and man, the people there did not like that one bit. Like, come on guys, I bought a 4090; do I have to absolutely love this new connector to show my loyalty to the brand or something? Can I not just buy some of the products while also criticizing the bits I don't like?

People seem to take this stance where it's like "well, my 4090 didn't melt, so I guess that means no 4090s melt unless the user does something wrong." I know that was the prevailing narrative for a while, but the fact that the updated 12V-2x6 connectors continue to melt kind of throws that into doubt.
Posted on Reply
#83
Knight47
theoutoIt's one way to interpret it, or nvidia was the only one wanting a tiny footprint on the power connector, and rushed to 12vhpwr. I doubt it was a conspiracy.
(Note: I know nothing about the timeline regarding 12vhpwr)
Some nvidia haters said they used it so they could make smaller PCBs, since copper isn't cheap.
kapone32That is summarily wrong. You are talking about OC.


That has to be the most narrative based argument I have seen. All of the PSU makers changed their PSUs to 3.0 because of Intel and AMD? Give me a break.
An ATX 3.0 PSU is not required to include a 12+4 pin (or 16-pin) 12VHPWR connector; see the Corsair HXi (2023) and RMx Shift.
Posted on Reply
#84
kapone32
Knight47Some nvidia haters said they used it so they can make smaller PCB, since copper isn't cheap.

ATX 3.0 PSU is not required to include a 12+4 pin (or 16-pin) ATX 12VHPWR connector, see Corsair HXi(2023) RMx Shift.
Are you serious? That PSU was released on January 23, 2023, and it's a special-edition unit at that. The point remains that it was not AMD or Intel that influenced PSU makers to include a connector that none of their own products support. That notion is preposterous.
Posted on Reply
#85
Knight47
kapone32That is also a PSU that is a special edition PSU.
Corsair PSUs are special edition?
Posted on Reply
#86
kapone32
Knight47Corsair PSU's are special edition?
You posted a PSU that is the only one configured like that. At this year's CES we saw cases that now support that PSU.
Posted on Reply
#87
Vayra86
Knight47Just trust me bro. Jon and that other dude with the hard to spell name are just trying to shift the blame to nvidia with that sponsor nonsense.

Anyone can see from a mile that they(PCI-SIG, AMD, Intel) jebaited nvidia with the 12vhpwr
Yeah no. Source or BS.
Posted on Reply
#88
Dr. Dro
Hecate91The Asrock Taichi 7900XTX average power consumption in gaming is 419 watts, it only goes higher with an OC or during power spikes.
I find the whole power consumption argument interesting as Nvidia users didn't seem to care with the RTX 3000 series. The difference in 350 vs 400 watts isn't going to show up on a power bill.
www.techpowerup.com/review/asrock-radeon-rx-7900-xtx-taichi/37.html

The RTX 3090 and 3090Ti has the 12 pin connector, unless the 7900XTX series was still being designed, AMD chose not to use it for whatever reason.
There are other threads here showing some RTX 40 series PCB's having the pin pads for 8 pin connectors, although with how Nvidia forces their AIB's to design their cards I don't doubt Nvidia forced AIB's to use the 12 pin connector.
Only the RTX 3090 Ti has the 12VHPWR connector; the original 3090 uses the traditional 8-pin PCIe connector. The one I had (ASUS TUF OC) was a dual 8-pin "standard power" type and had a maximum PL of 375 W (107%) with the slider maxed out. That's obviously not enough for a 3090, so it was very throttle-happy, as with all other standard-power 3090s, really.

But you proved my point about power efficiency: a 419 W average gaming load corresponds to... the absolute maximum that my Strix OC 4080 (basically the 4080 with the second-highest power limit available, other than the exotic Galax OC Lab with dual 12VHPWR connectors) is even allowed to pull with the power limit slider maxed out. It's a lot of energy, and an amount that no 4080/S is going to use regardless of workload unless you're doing some pretty extreme overclocking. It might not show up immediately on the power bill (unless you're based in Europe, where every kWh counts, apparently), but it sure does a number on your thermals, and that has implications of its own: where is this heat being dumped? Your case? Your room? More work for your AC, if applicable? ;)
Posted on Reply
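An editorial aside: the "will it show on the power bill" question in the exchange above is easy to put rough numbers on. A quick sketch using the 419 W and 350 W figures from the thread; the 2 h/day of gaming and the €0.3924/kWh rate (quoted earlier in the thread by gurusmi) are assumptions for illustration.

```python
# Annual cost of a ~70 W gaming-load difference (419 W vs 350 W, per the
# thread), at an assumed 2 h/day of gaming and 0.3924 EUR/kWh.

def annual_cost_eur(watts, hours_per_day, eur_per_kwh):
    """Yearly electricity cost of running a given load for the given hours."""
    return watts / 1000 * hours_per_day * 365 * eur_per_kwh

delta = annual_cost_eur(419, 2, 0.3924) - annual_cost_eur(350, 2, 0.3924)
print(f"Extra cost per year: {delta:.2f} EUR")  # roughly 20 EUR/year at these assumptions
```

Around €20 a year at those assumptions: noticeable at European energy prices, trivial elsewhere, which is consistent with both sides of the argument above.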
#89
wNotyarD
Dr. DroOnly the RTX 3090 Ti has the 12VHPWR connector, the original 3090 uses the traditional 8-pin PCIe connector. The one I had (ASUS TUF OC) was a dual 8-pin "standard power" type and had a maximum PL of 375 W (107%) with the slider maxed out. It's obviously not enough for a 3090 so it was very throttle happy, as with all other standard power 3090's, really.
The 3090Ti had the 12VHPWR connector, correct. The rest of the 3000-series FEs (bar the 3050) used Nvidia's own early 12-pin connector, without the 4 sense pins introduced by the 12VHPWR standard.
Posted on Reply
#90
Dr. Dro
wNotyarDThe 3090Ti had the 12VHPWR connector, correct. The rest of the 3000-series FEs (bar the 3050) used Nvidia's own early 12-pin connector, without the 4 sense pins introduced by the 12VHPWR standard.
Right, I forgot for a moment about the FE cards, but those connectors were only used on the FEs and not in any other design. With FE availability being what it is and all...
Posted on Reply
#91
Knight47
kapone32You posted a PSU that is the only one configured like that. At this year's CES we saw Cases that now support that PSU.
I posted two, please reread: the HXi (2023) and the RMx Shift. But there's the RMe too, and the unreleased RMx (2024). The HXi and RMe are not that special.
Posted on Reply
#92
Han44
The connector is just evolving; even toilet paper evolves. It's like that with everything.
Posted on Reply
#93
Dawora
There are much more important things in the world than the 12VHPWR connector.
I hope you guys can even sleep with all this connector stress...
Maybe go out sometimes, and leave the computer inside.
Posted on Reply
#94
chrcoluk
If the story is likely false, then edit the headline?

I only recently learned that Nvidia started quietly releasing new GPUs with a revised connector. So the issues were swept under the rug for existing owners of the beta product.
Dr. DroRight, I forgot for a moment that there's the FE cards, but those connectors were only used on the FE's and not in any other design. With FE availability being what it is and all...
The FEs also put less stress on bent cables, as the connector itself is angled on the GPU.
Posted on Reply
#95
wNotyarD
chrcolukThe FE's also had less stress on bending the cables as the connector itself is angled on the GPU.
Wasn't the angled connector for the 3070 Ti, 3080, 3080 Ti and 3090? That still leaves the 3070 and below with a straight connector.
Posted on Reply
#96
kapone32
Knight47I posted two, please reread. HXi(2023) and RMx Shift, but there's the RMe too and the unreleased RMx(2024). HXi and RMe are not that special.
Let's get back to reality: you claimed that AMD and Intel are responsible for Nvidia adopting 12VHPWR. It doesn't matter anyway, as you are hard pressed today to find a PSU that does not support ATX 3.0 with that new connector. Why do people act like Nvidia can do no wrong?
Posted on Reply
#97
Panther_Seraphin
Nvidia chose the 12VHPWR because it suited the design goals of the blow-through cooler on the FE cards. Including 3x 8-pin would have made the PCB a lot larger than intended and would have impacted thermals and noise negatively, as the rear fan would have to be made much smaller to accommodate the extra PCB length.
Posted on Reply
#98
Lycanwolfen
LOL, and people told me SLI used too much power. Ha ha ha. It's starting to look like the Voodoo 5 6000 all over again, when you needed one power supply for the video card and another for the computer. Honestly, just make it three 6-pin PCI-E connections. That would be a much better way: a three-way load across the power supply and card, instead of one connector that gets very hot and sometimes melts. Balancing the load over three PCI-E connectors would mean less heat and less chance of failure.
Posted on Reply
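An editorial aside on the suggestion above: the nominal PCIe CEM ratings (6-pin = 75 W; 8-pin = 150 W over three 12 V pins; 12VHPWR = 600 W over six 12 V pins) make the trade-off easy to quantify. A rough sketch, treating the ratings as nominal full-load figures:

```python
# Per-pin current sketch for the connectors discussed above. Ratings are the
# nominal PCIe CEM values: 6-pin = 75 W, 8-pin = 150 W, 12VHPWR = 600 W.
# Power-pin counts: the 8-pin has three 12 V pins, 12VHPWR has six.
V = 12.0

def amps_per_pin(watts, power_pins, volts=V):
    """Current per 12 V pin at the connector's full rated load."""
    return watts / volts / power_pins

print(f"8-pin PCIe: {amps_per_pin(150, 3):.2f} A per 12 V pin")
print(f"12VHPWR:    {amps_per_pin(600, 6):.2f} A per 12 V pin")
print(f"3x 6-pin total budget: {3 * 75} W")  # well short of a 450 W card
```

By spec, three 6-pin connectors only budget 225 W, well short of a 450 W card, so the suggestion would really need 8-pin connectors; and 12VHPWR pins carry roughly double the current of 8-pin pins through smaller contacts, which is the crux of the melting complaints.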
#99
R0H1T
iameatingjamAnyway, I posted on the nvidia subreddit a while ago, to point out a few of the flaws with the new connector, and man the people there did not like that one bit. Like come on guys, I bought a 4090, I have to absolutely love this new connector to show my loyalty to the brand or something? Can I not just, buy some of the products while also criticizing the bits I don't like?
Well, to be fair, that's true for all brand loyalists, especially on reddit, which is probably the worst of them! I'm on a few subreddits & it's like they're filled with 10-year-old kids :shadedshu:

This is why old school (tech) forums are much better for a slightly more nuanced debate, at least you can deal with more adults here.
Posted on Reply
#100
Prima.Vera
400W CPU + 800W GPU ... Future looks good /s
LycanwolfenLOL and the people telling me SLI used too much power, Ha Ha Ha, It's starting to look like the Voodoo 5 6000 all over again when you need a power supply for the video card and another one for the computer. Honestly just make it 3 connections with the 6 Pin PCI-E. Would be a much better way 3 way load on power supply and card instead of one connector which get's very hot and melts sometimes. Balance load over 3 PCI-E connectors would be less heat and less prone to failing.
Ahmmm, NO.
Posted on Reply