
Regarding 6-pin PCIe rated wattage!

When the 6-pin PCIe connector first came out it was rated for 75 W (20-22 AWG), but the difference between the 6-pin and the 8-pin is very small; in fact, I see no reason why a 6-pin cannot provide 150 W.
Both the 6-pin and 8-pin connectors have only three 12 V lines, with the 8-pin adding an extra ground and a sense wire that provide "stability".
I am fairly confident that with an upgraded wire gauge (from 20 to 16/14 AWG), using a 6-pin cable in place of an 8-pin should not be an issue!

Edit: Theoretically, a 16 AWG 6-pin PCIe cable can deliver 360 W, with each 12 V line supplying 120 W.
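The arithmetic behind those two figures can be sanity-checked in a few lines of Python. Note the 10 A per-wire current is the assumption behind the 360 W claim, not a spec value:

```python
def connector_power(amps_per_pin, n_12v_pins=3, volts=12.0):
    """Total power a PCIe power connector could carry, ignoring any derating."""
    return volts * amps_per_pin * n_12v_pins

# The spec's 75 W rating for a 6-pin works out to only ~2.08 A per +12 V wire:
print(75 / (12.0 * 3))        # ~2.08 A

# The 360 W figure above assumes each 16 AWG wire carries 10 A:
print(connector_power(10.0))  # 360.0 W
```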


Thank you for your time!
 
It's not just the wire gauge but how the power supply outputs current over those wires. Just because you increase the gauge doesn't mean the PSU will put out more current; it has to be "programmed" to do it.

Output from the PCIe slot is still 75 W.

75 W / 12 V = 6.25 A, so the rest has to come from the PCIe cables.

Say the GPU draws 400 W combined: 75 W comes from the PCIe slot, which leaves 325 W to come from two 8-pin plugs, 325 / 2 = 162.5 W each.

PCIe slot: 75 W / 12 V = 6.25 A
8-pin #1: 162.5 W / 12 V = 13.542 A
8-pin #2: 162.5 W / 12 V = 13.542 A

Since there are three 12 V pins in an 8-pin PCIe plug, each has to handle 54.17 W / 12 V = 4.514 A.

That leaves five grounds.

So three pins are hot, three are loop ground, and two are chassis ground (ESD/protection).
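The per-plug and per-pin numbers above are straightforward to reproduce; here's the same budget worked through in Python (the 400 W GPU is the hypothetical from the post, not a specific card):

```python
SLOT_W = 75.0   # power budget from the PCIe slot
GPU_W = 400.0   # hypothetical total board power
V = 12.0

cable_w = GPU_W - SLOT_W     # 325 W left for the cables
per_plug_w = cable_w / 2     # split across two 8-pin plugs
per_pin_w = per_plug_w / 3   # three +12 V pins per plug

print(f"slot:     {SLOT_W / V:.2f} A")                         # 6.25 A
print(f"per plug: {per_plug_w / V:.3f} A")                     # 13.542 A
print(f"per pin:  {per_pin_w:.2f} W = {per_pin_w / V:.3f} A")  # 54.17 W = 4.514 A
```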
 
They added more ground wires to stop them melting, or sending that power through the GPU to ground out through the slot.

I've melted PCIe cables before; it's not very fun.
 
33.33 A is needed for that 400 W GPU.

And all it takes is 0.15 A or greater to take a person out.

FYI, 526 VDC feels exactly like 120 VAC (I touched wires attached to a megger while it was in operation, doh!)
 
Fortunately you have to have high voltage to overcome skin resistance to get much amperage flowing.
 
I know this is kinda old, but I couldn't help noticing a mistake at the beginning there:

"Just because you increase the gauge doesn't mean the psu will put out more current, it has to be "programmed" in to do it."

The reason you upgrade the wires (in this case to 16/14 AWG) is to prevent those wires from melting, and the PSU is not programmed to send a certain amount of power through a particular line (connector)! If the conditions are right (the conductors can handle the wattage without damage), the PSU will keep supplying that current until it dies!
There is no PSU programming involved: the connected component pulls the watts it needs, and if the PSU can't deliver that power, it will likely shut down to prevent damage, or, in cheaper units, let out the magic smoke!
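That behavior can be illustrated with a toy model: the load sets the current, and the supply only steps in through over-current protection. The 40 A threshold here is an assumption for illustration; real OCP limits vary per PSU and per rail:

```python
OCP_LIMIT_A = 40.0  # assumed over-current protection threshold (varies per PSU)

def psu_delivers(load_watts, volts=12.0):
    """The load decides how much current to draw; the PSU only intervenes via OCP."""
    amps = load_watts / volts
    return None if amps > OCP_LIMIT_A else amps

print(psu_delivers(400))  # ~33.3 A: within the limit, so it is delivered
print(psu_delivers(600))  # 50 A: OCP trips and the supply shuts down (None)
```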
 
I'd imagine the lower lumped resistance of lower-gauge (thicker) wires would allow more current flow with less voltage loss.

HWiNFO64 reports my 1080 Ti can pull up to 471 W at 1.2 V Vcore and a 2164 MHz core clock; it also reports that the voltage at both 8-pin PCIe connectors drops as low as 11.536 V, while the +12 V slot voltage remains nearly constant.
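A rough I*R model shows cable and contact resistance can plausibly account for a good chunk of that sag. All the resistance and length values below are assumptions for illustration, not measurements of this cable; where HWiNFO samples the rail and the PSU's own load regulation would account for the rest:

```python
OHM_PER_M = 0.021    # 18 AWG copper, per meter (approx., assumed)
CONTACT_OHM = 0.005  # per mated connector contact (approx., assumed)
LENGTH_M = 0.6       # one-way cable length (assumed)
N_WIRES = 3          # three +12 V wires share each plug's current

cable_w = 471 - 75                    # power left for the cables after the slot
amps_per_plug = (cable_w / 2) / 12.0  # 16.5 A per 8-pin plug

# Round trip (supply + return), two contacts at each end, wires in parallel:
r_loop = (2 * OHM_PER_M * LENGTH_M + 4 * CONTACT_OHM) / N_WIRES
drop = amps_per_plug * r_loop
print(f"{amps_per_plug:.1f} A per plug -> ~{drop:.2f} V of cable drop")
```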
 
How many 8-pin connectors does your 1080 Ti have?
 
There are also ground contacts in the PCIe slot connector.

For my 1080 Ti, it's interesting that while the +12 V 8-pins show voltage drops under load, the +12 V PCIe slot voltage never varies.
 
See here for the analysis. In the optimal scenario 360 W is possible, but the guidelines are from 2004, and at that time inferior materials and wires were much more common.
If you have a modern AAA-quality PSU, that 75 W rating for the 6-pin PCIe is very, very conservative.
 
GPUs also get ground from the chassis when you screw the bracket down.
Y'know, NZXT used non-conductive screws as part of their fix for the "oops, you're on fire" PCIe risers.

I wonder if we could find grounding issues with the GPU's mounting in the problem setups?

You likely have a multi-rail PSU, and of course it works that way - the PCIe slot carries a much lower load.
 
@Mussels
No, I don't have a multi-rail PSU (it's a Rosewill Capstone 750 W Gold), according to the jonnyGURU review I read. I'm guessing the PCB doesn't route the +12 V power from the PCIe slot through the same circuitry as the +12 V from the 8-pin PCIe power connectors. My guess is that so much +12 V current is being sourced across the PCIe supplementary power connectors that the resistance of the cables themselves comes into play.
 
If it's truly single rail, then you're losing voltage to resistance along the way: extensions, connectors, adapters, loose connections, or just downright shitty measurements. You never know where they placed the sense points in the circuit, and as load increases, voltage always drops.
 
In the PC world we refer to the negative (-) as "ground".
The chassis is connected to "earth" ground, and if this were ever to happen ..... negative ground touching earth ground = house fire.
 
That's different between countries, I believe.

Like how many US sockets don't have a ground pin.
 
They actually do have a female ground insert, although it is very primitive. I wish the entire world used the British wiring system, where each plug must have its own fuse! (I live in Kosovo but am very familiar with American wiring.)
Where are you from?
 

Earth ground on all sockets has been code in the US since ... the '60s? But many houses built before then (like my old AND current house!) have at least a few ground terminals that are simply not connected, because it's easy to drop in a 3-terminal outlet, but a PITA to run ground wires everywhere.
 
New ones must. But old ones are not uncommon, yeah. Have a house from before the '70s and you may find a few.
 
I almost had a home sale fall through because the buyer didn't like that there were no grounds in the kitchen above the counters, even though the house was built in '78. I added GFCI outlets and pointed out that very few kitchen appliances have ground plugs for it to matter. I'd much rather have GFCI with no ground near water than ground without GFCI anyhow: a grounded device isn't going to trip a circuit breaker if it falls into water, whereas pretty much any device will trip a GFCI if it's not waterproof.
 
You all kinda backed up my point with the whole
yes/no/maybe/here's-alternatives.

The USA has more home wiring options than two countries with different standards.
 

How does a Ground Fault Circuit Interrupter work with no ground to fault?
 
Watch 3:59 to 4:33.

I'm not an electrician. I guess it senses hot going to ground somehow; I don't know how.

Edit: I believe the GFCI detects that there's slightly more current running through one conductor than the other, and that's what causes it to trip. A working device drawing power from the outlet pushes the same amount of current through the hot and neutral sides, so it doesn't trip the GFCI. If your device fell into water, it would trip the GFCI as long as there is a path to ground, because the hot wire would carry more current than the neutral.
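That current-imbalance idea is easy to model: trip whenever the hot and neutral currents differ by more than the trip threshold (the ~5 mA figure is the typical US residential GFCI rating):

```python
TRIP_THRESHOLD_A = 0.005  # typical US residential GFCI trip point (~5 mA)

def gfci_trips(hot_amps, neutral_amps):
    """Trip when current leaves via some path other than the neutral return."""
    return abs(hot_amps - neutral_amps) > TRIP_THRESHOLD_A

print(gfci_trips(10.000, 10.000))  # balanced load -> False
print(gfci_trips(10.000, 9.990))   # 10 mA leaking to earth -> True
```

This also shows why a GFCI works without a ground conductor at the outlet: it only compares the two currents it can see, and any leakage path to earth (plumbing, a wet floor, a person) unbalances them.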
 