
NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

It was disproved by a more reputable source (I hope), but this just means that we can expect some high power consumption figures.

I guess Nvidia is scared of Intel's 400 W CPUs, so they need to assert their dominance.
 
I guess Nvidia got scared of Intel's 400 W CPUs, so they need to assert their dominance.
Sometimes they must do something known as a power move.
 
That's... pretty far out there. Source? Or just conjecture?
Just trust me bro. Jon and that other dude with the hard-to-spell name are just trying to shift the blame to Nvidia with that sponsor nonsense.

Anyone can see from a mile away that they (PCI-SIG, AMD, Intel) jebaited Nvidia with the 12VHPWR.
 
Just trust me bro. Jon and that other dude with the hard-to-spell name are just trying to shift the blame to Nvidia with that sponsor nonsense.

Anyone can see from a mile away that they (PCI-SIG, AMD, Intel) jebaited Nvidia with the 12VHPWR.
That's one way to interpret it; or Nvidia was the only one who wanted a tiny footprint for the power connector and rushed to 12VHPWR. I doubt it was a conspiracy.
(Note: I know nothing about the timeline regarding 12VHPWR.)
 
Theoretical maximums don't equate to real power consumption figures; it's well known the RX 7900 series can chug about as much power as a 4090. W1zz even tested a graphene pad with the 7900 XTX at 475 W.
1. Right now I have to pay €0.3924 per kWh. If the price difference between the cards (Nvidia, AMD) is about €350, you can calculate for yourself how large the power gap between the two cards needs to be just to offset the difference in purchase price.

2. Nvidia device drivers are also available and working on Linux; they are just closed source, unlike AMD's.

3. What someone tests doesn't sway me. Every tester will sell their soul if it brings them a benefit; as the job title says, they try to influence you to buy a certain product. In my day that was called marketing. Unlike them, I decide on pure facts, no matter what an influencer tries to push on me.

In general, it remains a fact that this 12VHPWR connector is less reliable than the PCIe one. Right now the 12VHPWR plug is a piece of junk; that's the real and only reason they introduced the revision. You also run into big problems when you need a small bend radius in the cable; it simply isn't possible. Within the next four or five years the standard will mature to the point where it is as reliable as the PCIe cable has been for years, but it isn't now. I'm also not a dumb early adopter, so they can come back to me after the connector has matured. As easy as that.
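The break-even arithmetic in point 1 can be sketched quickly (the €350 price gap and €0.3924/kWh rate are the figures from the post; the 100 W power-draw gap is just an assumed example):

```python
# Break-even estimate: how long must the cheaper-but-hungrier card run
# before the extra electricity cost eats the purchase-price gap?

price_gap_eur = 350.0        # price difference between the cards (from the post)
rate_eur_per_kwh = 0.3924    # electricity price (from the post)
power_gap_w = 100.0          # assumed power-draw difference in watts (hypothetical)

# Energy that €350 buys at this rate, then hours at the assumed power gap
break_even_kwh = price_gap_eur / rate_eur_per_kwh           # ~892 kWh
break_even_hours = break_even_kwh / (power_gap_w / 1000.0)  # ~8,919 hours

print(f"{break_even_kwh:.0f} kWh, i.e. {break_even_hours:.0f} hours of gaming")
```

At five hours of gaming a day that is nearly five years, which is the poster's point: a realistic power gap takes a very long time to pay back a €350 price difference at that rate.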
 
Theoretical maximums don't equate to real power consumption figures; it's well known the RX 7900 series can chug about as much power as a 4090. W1zz even tested a graphene pad with the 7900 XTX at 475 W.

That is simply wrong. You are talking about OC.

Just trust me bro. Jon and that other dude with the hard-to-spell name are just trying to shift the blame to Nvidia with that sponsor nonsense.

Anyone can see from a mile away that they (PCI-SIG, AMD, Intel) jebaited Nvidia with the 12VHPWR.
That has to be the most narrative-based argument I have seen. All of the PSU makers changed their PSUs to ATX 3.0 because of Intel and AMD? Give me a break.
 
Theoretical maximums don't equate to real power consumption figures; it's well known the RX 7900 series can chug about as much power as a 4090. W1zz even tested a graphene pad with the 7900 XTX at 475 W.
The ASRock Taichi 7900 XTX's average power consumption in gaming is 419 W; it only goes higher with an OC or during power spikes.
I find the whole power consumption argument interesting, as Nvidia users didn't seem to care with the RTX 3000 series. The difference between 350 and 400 W isn't going to show up on a power bill.
Unlikely IMO, and the only reason the 7900 XTX didn't have one is that the hardware design was finalized by the time those began to roll out. Other cards just rode the wave, primarily as a publicity stunt: "hey look, ours don't blow up."
The RTX 3090 and 3090 Ti have the 12-pin connector, so unless the 7900 XTX series was still being designed at that point, AMD chose not to use it for whatever reason.
There are other threads here showing some RTX 40-series PCBs having the pin pads for 8-pin connectors, although given how Nvidia forces its AIBs to design their cards, I don't doubt Nvidia forced AIBs to use the 12-pin connector.
 
I will be getting an AMD GPU for my next rig. Just smiling about Nvidia: how many tries will they need to get a new, solid connector? AMD needs the same power and still uses the old, reliable connector.
I don't blame you. I have a 4090 with one of the original 12VHPWR connectors (the one before the update), and... it does make me a little nervous, considering there's not nearly enough room in my case to keep the cable as straight as they say it should be. At least I undervolt and limit frames, so there's not too much power going through there at any one time.

Anyway, I posted on the Nvidia subreddit a while ago to point out a few of the flaws with the new connector, and man, the people there did not like that one bit. Like, come on guys: I bought a 4090, so I have to absolutely love this new connector to show my loyalty to the brand or something? Can I not just buy some of the products while also criticizing the bits I don't like?

People seem to take this stance of "well, my 4090 didn't melt, so I guess that means no 4090s melt unless the user does something wrong." I know that was the prevailing narrative for a while, but the fact that the updated 12V-2x6 connectors continue to melt kind of throws that into doubt.
 
That's one way to interpret it; or Nvidia was the only one who wanted a tiny footprint for the power connector and rushed to 12VHPWR. I doubt it was a conspiracy.
(Note: I know nothing about the timeline regarding 12VHPWR.)
Some Nvidia haters said they used it so they could make a smaller PCB, since copper isn't cheap.
That is simply wrong. You are talking about OC.


That has to be the most narrative-based argument I have seen. All of the PSU makers changed their PSUs to ATX 3.0 because of Intel and AMD? Give me a break.
An ATX 3.0 PSU is not required to include a 12+4-pin (16-pin) 12VHPWR connector; see the Corsair HXi (2023) and RMx Shift.
 
Some Nvidia haters said they used it so they could make a smaller PCB, since copper isn't cheap.

An ATX 3.0 PSU is not required to include a 12+4-pin (16-pin) 12VHPWR connector; see the Corsair HXi (2023) and RMx Shift.
Are you serious? A PSU released on Jan 23, 2023, and a special-edition one at that. The point remains that it was not AMD or Intel that influenced PSU makers to include a connector that none of their products support. That notion is preposterous.
 
Corsair PSUs are special edition?
You posted a PSU that is the only one configured like that. At this year's CES we saw cases that now support that PSU.
 
Just trust me bro. Jon and that other dude with the hard-to-spell name are just trying to shift the blame to Nvidia with that sponsor nonsense.

Anyone can see from a mile away that they (PCI-SIG, AMD, Intel) jebaited Nvidia with the 12VHPWR.
Yeah no. Source or BS.
 
The ASRock Taichi 7900 XTX's average power consumption in gaming is 419 W; it only goes higher with an OC or during power spikes.
I find the whole power consumption argument interesting, as Nvidia users didn't seem to care with the RTX 3000 series. The difference between 350 and 400 W isn't going to show up on a power bill.

The RTX 3090 and 3090 Ti have the 12-pin connector, so unless the 7900 XTX series was still being designed at that point, AMD chose not to use it for whatever reason.
There are other threads here showing some RTX 40-series PCBs having the pin pads for 8-pin connectors, although given how Nvidia forces its AIBs to design their cards, I don't doubt Nvidia forced AIBs to use the 12-pin connector.

Only the RTX 3090 Ti has the 12VHPWR connector; the original 3090 uses the traditional 8-pin PCIe connector. The one I had (ASUS TUF OC) was a dual 8-pin "standard power" type and had a maximum PL of 375 W (107%) with the slider maxed out. That's obviously not enough for a 3090, so it was very throttle-happy, as with all other standard-power 3090s, really.

But you proved my point about power efficiency: a 419 W average gaming load corresponds to the absolute maximum that my Strix OC 4080 (basically the 4080 with the second-highest power limit available, other than the exotic Galax OC Lab with dual 12VHPWR connectors) is even allowed to pull with the power limit slider maxed out. It's a lot of energy, and an amount that no 4080/S is going to use regardless of workload unless you're doing some pretty extreme overclocking. It might not show up immediately on the power bill (unless you're based in Europe, where every kWh counts, apparently), but it sure does a number on your thermals, and that has implications of its own: where is this heat being dumped? Your case? Your room? More work for your AC, if applicable? ;)
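As a rough illustration of the cost and thermals point (the 419 W figure and the €0.3924/kWh rate are from earlier in the thread; the ~300 W comparison wattage is an assumed typical 4080 gaming draw, not a measured number):

```python
# Rough running-cost and heat-output comparison for one hour of gaming.
# Practically all of the electrical power a GPU draws ends up as heat
# dumped into the case and the room.

rate_eur_per_kwh = 0.3924   # electricity price quoted earlier in the thread

def hourly_cost_eur(watts: float) -> float:
    """Cost of running a load of `watts` for one hour at the quoted rate."""
    return watts / 1000.0 * rate_eur_per_kwh

for label, watts in [("7900 XTX (419 W avg, from the thread)", 419.0),
                     ("4080 (~300 W, assumed)", 300.0)]:
    cost = hourly_cost_eur(watts)
    print(f"{label}: ~€{cost:.3f}/h and {watts:.0f} W of heat into the room")
```

The per-hour cost difference is small, which supports the "won't show on the power bill" side; the heat difference, however, is there every single hour.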
 
Only the RTX 3090 Ti has the 12VHPWR connector; the original 3090 uses the traditional 8-pin PCIe connector. The one I had (ASUS TUF OC) was a dual 8-pin "standard power" type and had a maximum PL of 375 W (107%) with the slider maxed out. That's obviously not enough for a 3090, so it was very throttle-happy, as with all other standard-power 3090s, really.
The 3090 Ti had the 12VHPWR connector, correct. The rest of the 3000-series FEs (bar the 3050) used Nvidia's own early 12-pin connector, without the 4 sense pins introduced by the 12VHPWR standard.
 
The 3090 Ti had the 12VHPWR connector, correct. The rest of the 3000-series FEs (bar the 3050) used Nvidia's own early 12-pin connector, without the 4 sense pins introduced by the 12VHPWR standard.

Right, I forgot for a moment that there are the FE cards, but those connectors were only used on the FEs and not in any other design. With FE availability being what it is and all...
 
You posted a PSU that is the only one configured like that. At this year's CES we saw cases that now support that PSU.
I posted two, please reread: the HXi (2023) and the RMx Shift. There's also the RMe and the unreleased RMx (2024). The HXi and RMe are not that special.
 
Connectors evolve; even toilet paper is evolving. It's like that with everything.
 
There are much more important things in the world than the 12VHPWR connector.
I hope you guys can even sleep with all this connector stress...
Maybe go outside sometimes and leave the computer inside.
 
If the story is likely to be false, then edit the subject title?

I only recently learned that Nvidia started quietly releasing new GPUs with a revised connector. So the issues were swept under the rug for existing owners of the beta product.

Right, I forgot for a moment that there are the FE cards, but those connectors were only used on the FEs and not in any other design. With FE availability being what it is and all...

The FEs also put less stress on cable bends, as the connector itself is angled on the GPU.
 
The FEs also put less stress on cable bends, as the connector itself is angled on the GPU.
Wasn't the angled connector only on the 3070 Ti, 3080, 3080 Ti, and 3090? That still leaves the 3070 and below with a straight connector.
 
I posted two, please reread: the HXi (2023) and the RMx Shift. There's also the RMe and the unreleased RMx (2024). The HXi and RMe are not that special.
Let's get back to reality: you posted that AMD and Intel are responsible for Nvidia adopting 12VHPWR. It doesn't matter, as you are hard-pressed today to find a newly released PSU that does not support ATX 3.0 with that special connector. Why do people act like Nvidia can do no wrong?
 
Nvidia chose the 12VHPWR connector because it suited their design goals for the flow-through cooler design on the FE cards. Trying to include 3x 8-pin would make the PCB a lot larger than intended and would impact thermals/noise negatively, as they would have to make the rear fan a lot smaller to account for the extra PCB length.
 
LOL, and people told me SLI used too much power, ha ha. It's starting to look like the Voodoo 5 6000 all over again, when you needed one power supply for the video card and another for the computer. Honestly, just make it three connections with 6-pin PCIe. It would be a much better way: a three-way load across the power supply and card instead of one connector that gets very hot and sometimes melts. Balancing the load over three PCIe connectors would mean less heat and less chance of failure.
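The "spread the load" idea can be put in rough numbers (a sketch using the nominal connector power limits of 150 W for 8-pin PCIe and 600 W for 12VHPWR, and the usual three and six live 12 V pins respectively; contact resistance is assumed equal across connector types):

```python
# Per-pin current for the two connector types. Resistive heating at each
# contact scales with I^2 * R, so doubling the per-pin current roughly
# quadruples the heat generated per contact for the same resistance.

def amps_per_pin(watts: float, volts: float, live_pins: int) -> float:
    """Average current through each live 12 V pin."""
    return watts / volts / live_pins

pcie8 = amps_per_pin(150.0, 12.0, 3)   # 8-pin PCIe: 150 W over three 12 V pins
hpwr = amps_per_pin(600.0, 12.0, 6)    # 12VHPWR: 600 W over six 12 V pins

print(f"8-pin PCIe: {pcie8:.2f} A/pin, 12VHPWR: {hpwr:.2f} A/pin")
print(f"heat per contact (same contact resistance): {(hpwr / pcie8) ** 2:.1f}x")
```

The smaller connector runs each pin at roughly twice the current of an 8-pin PCIe pin at its rated limit, which is why per-contact heating, not total wattage, is at the center of the melting debate.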
 
Anyway, I posted on the Nvidia subreddit a while ago to point out a few of the flaws with the new connector, and man, the people there did not like that one bit. Like, come on guys: I bought a 4090, so I have to absolutely love this new connector to show my loyalty to the brand or something? Can I not just buy some of the products while also criticizing the bits I don't like?
Well, tbf that's true of all brand loyalists, especially on Reddit, which is probably the worst of them! I'm on a few subreddits and it's like they're filled with 10-year-old kids :shadedshu:

This is why old-school (tech) forums are much better for a slightly more nuanced debate; at least you can deal with more adults here.
 