Monday, October 11th 2021

PCIe Gen5 "12VHPWR" Connector to Deliver Up to 600 Watts of Power for Next-Generation Graphics Cards

Upcoming graphics cards based on the PCIe Gen5 standard will use an interface with double the bandwidth of the Gen4 slots we use today, and they will also bring a new power connector. According to information exclusive to Igor's Lab, the new connector will be called 12VHPWR and will carry as many as 16 pins. Of those 16 pins, 12 are dedicated to power delivery, while the remaining four are sideband signal contacts that coordinate the delivery. The connector is specified to carry as much as 600 Watts of power.

The new 12VHPWR connector should work exclusively with PCIe Gen5 graphics cards and is not backward compatible with anything else. It is said to replace the three standard 8-pin power connectors found on some high-end graphics cards, and will likely push power supply manufacturers to adopt the new standard. The official PCI-SIG specification defines each pin as capable of sustaining up to 9.2 Amps, which across the six +12 V supply pins (the other six power contacts are ground returns) adds up to 55.2 Amps at 12 Volts. In theory that works out to roughly 662 Watts; however, Igor's Lab notes that the connector is limited to 600 Watts. Additionally, the 12VHPWR power pins sit on a 3.00 mm pitch, while the contacts in legacy 2×3 (6-pin) and 2×4 (8-pin) connectors lie on a larger 4.20 mm pitch.
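As a quick sanity check on those figures, here is a minimal sketch of the spec math, assuming the usual split of the 12 power contacts into six +12 V supply pins and six ground returns:

# Back-of-the-envelope check of the quoted 12VHPWR figures.
# Assumption: 12 power contacts = 6 supply (+12 V) pins + 6 ground returns.
SUPPLY_PINS = 6
AMPS_PER_PIN = 9.2   # PCI-SIG per-pin sustained current
VOLTAGE = 12.0

total_current = SUPPLY_PINS * AMPS_PER_PIN      # 55.2 A
theoretical_watts = total_current * VOLTAGE     # 662.4 W
rated_watts = 600                               # the spec caps the connector here

print(f"{total_current:.1f} A -> {theoretical_watts:.1f} W theoretical, {rated_watts} W rated")

Running it gives 55.2 A and roughly 662 W, matching the numbers above, with the official rating rounded down to 600 W.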
There are already implementations of this connector, and one comes from Amphenol ICC. The company has designed a 12VHPWR connector and already lists it for sale; you can check it out on the company's website.
Source: Igor's Lab

97 Comments on PCIe Gen5 "12VHPWR" Connector to Deliver Up to 600 Watts of Power for Next-Generation Graphics Cards

#26
AVATARAT
lynx29: All of it confuses me. So will we all need to buy new PSUs for these cards, or just use adapters?
If you already have a 1 kW PSU, then no, you probably won't need to :laugh:
Richards: Any GPU power consumption of 500 watts should be banned... you don't need that much performance
If this happens, then Nvidia will lose the performance crown and probably die :laugh:
Posted on Reply
#27
Chomiq
ZoneDymo: So this is not the same as that new standard Nvidia is trying to push?

And also... great... let's just embrace GPUs using more and more power...
This is rumored to be used by the 3090 Super/Ti.
Posted on Reply
#28
qubit
Overclocked quantum bit
Bomby569: 600 W just for a GPU is insane
But great for starting a fire lol. :laugh:
Posted on Reply
#29
ShurikN
Khonjel: Why?! Why only the high-wattage ones, I mean. Replace the 6-pin with this as well. Are we at the dawn of two different PCI-SIG power connectors? 6-pin for low power and 12-pin for high power. I know there's 6-pin and 8-pin today as well, but 8-pins are mostly 6+2 pin, so they are compatible.
I agree, if they want to go the 12-pin route, then go all the way. The 6+2/8-pin standard is old AF anyway.
Posted on Reply
#30
BSim500
MentalAcetylide: Well, unless there's some kind of breakthrough in GPU tech, there's really no other route for them to go besides "bigger" and "moar wattage".
Or we could go back to having game devs relearn how to optimise games better (i.e., "a rising tide lifts all boats")? Let's be honest, the RTX 3090 problems with New World reveal that GPU manufacturers already can't build 350-450 W GPUs without them exploding due to inadequate power regulation circuitry. God help anyone who wants the same design engineers to scale that up to 600 W...
Posted on Reply
#32
Space Lynx
Astronaut
ShurikN: I agree, if they want to go the 12-pin route, then go all the way. The 6+2/8-pin standard is old AF anyway.
If PSU companies move forward in unison to fix this, I say great, but we also need mobo manufacturers to move forward on creating a new plug-and-play standard for cases and mobos for LED lights, power buttons, etc. The only part of building PCs I have truly hated over the years was those tiny, tiny little wires that go into the mobo from the PC case... sigh...

@TheLostSwede call your friends to have them call their friends, and get all the big bosses in a room together. Unite the clans! Create a new mobo and PC case standard to make life easier! This moment is now your existence! What will you do?!?! UNITE THE CLANS!!! CALL THEM!

only 51 seconds long, watch it brothers!!!

NOW IS OUR CHANCE! NOW!!! IF WE WIN, WE WILL HAVE WHAT NONE OF US HAVE EVER HAD BEFORE BROTHERS! THE TINY WIRES WILL BE GONE!!!

@TheLostSwede @W1zzard UNITE THE CLANS!!!

:rockout::rockout::rockout::rockout::rockout:
Posted on Reply
#33
metazack
Bigger is better? Why is the GPU market becoming more like the car industry before the '70s Oil Crisis? Just introduce a 1000 W GPU then, why stop at 600 W?
Posted on Reply
#34
P4-630
Good for PSU manufacturers, they get to sell everyone new PSUs...
Posted on Reply
#35
Bomby569
This is 675 W just for the GPU (600 W from the connector plus 75 W from the slot). Assuming you wouldn't pair this with a cheap CPU, let's go with another 250 W for the CPU, plus some SSDs, some rainbow puke, and lots of fans, and this is a 1300 W PC.
Posted on Reply
#36
Jism
God,

It means that the specification allows for up to 600 W.

Doesn't mean GPUs will use 600 W. It's more interesting for enterprise, as compute cards like MCM-based ones could consume that.
Posted on Reply
#37
cst1992
lynx29: Create a new mobo and PC case standard to make life easier! This moment is now your existence!
xkcd.com/927/
Posted on Reply
#38
rainxh11
While I don't really care how much power a card needs if it's that powerful, I'm terrified of the heat a GPU that consumes 600 W will produce. This is a mini thermonuclear reactor; there's nothing I know of that will be able to cool this thing.
Posted on Reply
#39
dgianstefani
TPU Proofreader
Maybe instead of everyone assuming that, because the connector supports up to 600 W, all cards will now be 600 W cards, we can simply be thankful for improvements.

I personally love having a single custom-sleeved 12-pin cable for my GPU; it's great for tidiness in my SFF build.

Stop being old people afraid of change. This is a genuine improvement.
Posted on Reply
#40
Bomby569
dgianstefani: Maybe instead of everyone assuming that, because the connector supports up to 600 W, all cards will now be 600 W cards, we can simply be thankful for improvements.

I personally love having a single custom-sleeved 12-pin cable for my GPU; it's great for tidiness in my SFF build.

Stop being old people afraid of change. This is a genuine improvement.
The problem is that we all have perfectly good PSUs, and there is no need to create e-waste, especially at this time, or a new shortage if everyone goes out to buy new PSUs. Having one cable instead of two isn't an improvement, it's wasting money.
Anyway, this is clearly to answer the upcoming cards, because we can clearly see an insane increase in power demand from them; it's not about cable management.
Posted on Reply
#41
dgianstefani
TPU Proofreader
You don't need a whole new PSU, use your brain. Exactly the same thing happened with RTX 3xxx GPUs: a simple adapter was included with the GPU, problem solved. Old PSUs use the adapter, new ones use a simpler single connection at both ends.
Posted on Reply
#42
illusion archives
Well, 600 W is surely an insane level. But if we can use it to feed a GPU performing like 4× an RTX 3090, that will not be so bad.
Considering what AMD did between RDNA1 and RDNA2, the RX 6900 XT uses just 133% of the power of the RX 5700 XT to get 200% of the performance with 200% of the shading units.
What is truly destructive to the environment is that Nvidia, AMD, and Intel may be "forced" to push GPU and VRAM clocks too high, ending up with a terrible energy-efficiency ratio, due to the need to make "the best card for that budget".
Posted on Reply
#43
ZoneDymo
dgianstefani: Maybe instead of everyone assuming that, because the connector supports up to 600 W, all cards will now be 600 W cards, we can simply be thankful for improvements.

I personally love having a single custom-sleeved 12-pin cable for my GPU; it's great for tidiness in my SFF build.

Stop being old people afraid of change. This is a genuine improvement.
Your only positive is... the look of a PC (cable)... which in and of itself is already baffling, but then you turn it into us being afraid of change? Could you at least read what people are saying before responding in such a... way?
Posted on Reply
#44
P4-630
dgianstefani: You don't need a whole new PSU, use your brain. Exactly the same thing happened with RTX 3xxx GPUs: a simple adapter was included with the GPU, problem solved. Old PSUs use the adapter, new ones use a simpler single connection at both ends.
How much can one 6-pin deliver with current PSUs? I thought it was just 75 watts?... And one 8-pin was 150 watts?...
How simple could that adapter be?...
Posted on Reply
#45
b4psm4m
dgianstefani: You don't need a whole new PSU, use your brain. Exactly the same thing happened with RTX 3xxx GPUs: a simple adapter was included with the GPU, problem solved. Old PSUs use the adapter, new ones use a simpler single connection at both ends.
The problem is that the article specifically says the connector is not backwards compatible, so you may not be able to use current supplies. If it were just power delivery, then fine, you could do it; but it includes four sense wires to determine how the power is to be delivered, so how are you going to adapt those?
Posted on Reply
#46
Vayra86
I think this guy knew what's up way earlier than us

Posted on Reply
#47
Bomby569
P4-630: How much can one 6-pin deliver with current PSUs? I thought it was just 75 watts?... And one 8-pin was 150 watts?...
How simple could that adapter be?...
The PCIe slot can only deliver 75 watts. The current cables can only deliver 150 watts each (it doesn't matter how many pins), or at least anything more isn't recommended. Those are the limits we have now.
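For a rough illustration of what those limits mean in practice, here is a small sketch using the commonly quoted spec budgets (assumed values, not figures from this thread), comparing legacy multi-connector cards against a card using the new connector:

# Illustrative comparison of official power budgets (assumed spec values).
SLOT_W = 75            # PCIe x16 slot
EIGHT_PIN_W = 150      # per 8-pin connector, per spec
TWELVE_VHPWR_W = 600   # new 12VHPWR connector rating

dual_8pin_card = SLOT_W + 2 * EIGHT_PIN_W     # 375 W
triple_8pin_card = SLOT_W + 3 * EIGHT_PIN_W   # 525 W
gen5_card = SLOT_W + TWELVE_VHPWR_W           # 675 W

print(dual_8pin_card, triple_8pin_card, gen5_card)

By those budgets, even a triple 8-pin card officially tops out at 525 W, while a single 12VHPWR connector plus the slot covers 675 W.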
Vayra86: I think this guy knew what's up way earlier than us

It's a funny joke, but I really think it's the way we are going. Power efficiency is a thing of the past; it seems like it used to be a concern, but not anymore, and it's way easier to just do R&D for more power without any efficiency concerns.
Posted on Reply
#48
Vayra86
lynx29: If PSU companies move forward in unison to fix this, I say great, but we also need mobo manufacturers to move forward on creating a new plug-and-play standard for cases and mobos for LED lights, power buttons, etc. The only part of building PCs I have truly hated over the years was those tiny, tiny little wires that go into the mobo from the PC case... sigh...

@TheLostSwede call your friends to have them call their friends, and get all the big bosses in a room together. Unite the clans! Create a new mobo and PC case standard to make life easier! This moment is now your existence! What will you do?!?! UNITE THE CLANS!!! CALL THEM!

only 51 seconds long, watch it brothers!!!

NOW IS OUR CHANCE! NOW!!! IF WE WIN, WE WILL HAVE WHAT NONE OF US HAVE EVER HAD BEFORE BROTHERS! THE TINY WIRES WILL BE GONE!!!

@TheLostSwede @W1zzard UNITE THE CLANS!!!

:rockout::rockout::rockout::rockout::rockout:
But... but... they feel so vintage!
Bomby569: It's a funny joke, but I really think it's the way we are going. Power efficiency is a thing of the past; it seems like it used to be a concern, but not anymore, and it's way easier to just do R&D for more power without any efficiency concerns.
Yep... most of my jokes go that direction, fun never comes free :D
Posted on Reply
#49
TheinsanegamerN
Richards: Any GPU power consumption of 500 watts should be banned... you don't need that much performance
You don't need to play games either, let's ban those too.
ZoneDymo: You might need the performance; what you don't need/want is the consumption. We evolve in many areas by making products more efficient; while GPUs are also more efficient, they also consistently use more power.
I am in favor of some law that puts a limit on the power consumption of such a product; let the manufacturers find other ways to squeeze out more performance.
We should legislate how long you are allowed to play games to save energy.
cst1992: My GTX 970 only consumes a max of 145 watts despite being the monster that it is (for its time).
My 3060 Ti is almost 4× as powerful as the 970, and STILL only consumes 200 watts maximum.
WHY THE HELL do we need individual 600 W PCIe power connectors??!! For 1200 W graphics cards? Or are there some monster mining chips coming that pack the hash rate of multiple 3090s that I don't know about?
To replace multiple power connectors used now.

The current 8-pin system officially supports 150 watts "per spec" but is more than capable of supporting up to 400 watts with appropriate cabling. This new 12-pin connector is made with the heavier cabling as standard rather than optional, hence the higher power output, and it also means a single connector that can be used on many GPUs rather than 6-pin, 8-pin, 6+2-pin, 2x6+2-pin, 2x6-pin, etc. Now every GPU can use a single 12-pin regardless of power draw, which simplifies wiring.

And in case you haven't noticed, the RTX 3090 and 3080 Ti are hungry bois on a completely different level than the 3060 Ti. High-end, big GPUs have always been power hogs compared to the midrange stuff. Just because the 12-pin exists does not mean that every GPU is now going to be space-heater tier.
Posted on Reply
#50
Darmok N Jalad
The other half of this conversation is that not only are we looking at more power consumption with GPUs, but CPUs are going there as well. 95-105 W used to be the enthusiast-grade chips, but next gen (Alder Lake and Zen 4) is rumored to have 165 W enthusiast-level CPUs. And we already know that rating is a bit of a joke, as the chips will happily exceed it all day long with a few BIOS levers thrown. We very well could see 600 W GPUs and CPUs that peak at 500 W or better. Things are going to get even more expensive at the high end, as there's just too much complexity in power delivery and cooling. With so many components to keep powered, I suspect it will even be less efficient at idle.

Now, all that said, if one can be content to not have the best of the best, the mid-grade everything will probably be fairly efficient. Current-gen consoles set the tone for game engines for a long time, and even the next level of mid-grade products should have no trouble being faster than what’s in XSX and PS5.
Posted on Reply