Tuesday, September 8th 2020

Corsair Working On Direct 12-Pin NVIDIA Ampere Power Cable

NVIDIA introduced a new 12-pin power connector with its RTX 30-series Founders Edition cards to accommodate their higher power draw. The new RTX 30-series cards carry GPU power ratings of 220 W, 320 W, and 350 W for the RTX 3070, 3080, and 3090, respectively. The new 12-pin connector is roughly the same size as a single 8-pin PCIe connector but can provide significantly more power in that same space. NVIDIA will supply an adapter in the box to convert two 8-pin connectors to a single 12-pin connector; however, this requires extra cable management and introduces another point of failure.

Corsair has announced it is developing a custom cable, fully compatible with all Type 3 and Type 4 CORSAIR modular power supplies, to allow for a clean connection. The new cable connects two PCIe/CPU PSU ports directly to the new 12-pin connector and is currently undergoing development and testing. Corsair also now recommends a PSU rating of 850 watts or higher for the RTX 3090. The Corsair 12-pin cable should be available for sale by September 17th, the same day the RTX 30-series cards launch; pricing wasn't announced, but you can sign up to be notified here.
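For a rough sense of scale, here is a quick Python sketch comparing the quoted board powers with what the bundled two-8-pin adapter route can deliver; the 150 W per 8-pin and 75 W from the slot are generic PCIe spec values, not figures from NVIDIA or Corsair.

```python
# Rough comparison of the quoted board powers against what the bundled
# 2x 8-pin adapter route can deliver under the standard PCIe ratings.
# The 150 W / 75 W figures are generic PCIe spec values, not NVIDIA numbers.

PCIE_8PIN_W = 150   # spec rating of one 8-pin PCIe power connector
PCIE_SLOT_W = 75    # spec rating of the x16 slot itself

board_power_w = {"RTX 3070": 220, "RTX 3080": 320, "RTX 3090": 350}  # from the article

available_w = 2 * PCIE_8PIN_W + PCIE_SLOT_W  # two 8-pins via the adapter, plus the slot
for card, watts in board_power_w.items():
    print(f"{card}: {watts} W rated, {available_w - watts} W of spec headroom left")
```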
Source: Corsair

51 Comments on Corsair Working On Direct 12-Pin NVIDIA Ampere Power Cable

#26
Caring1
jonnyGURUThat statement doesn't sound well thought out. The suggested wattage has nothing to do with the power supply's capability and everything to do with the speculated power consumption of the system itself.
Actually you probably didn't understand what I meant by my comment.
Higher capacity PSUs (Wattage) also are able to deliver more Amperage over the 12V rail, which is needed for the new GPUs.
It's clearly easier and cheaper for the manufacturers to upsell to a higher wattage PSU than increase available Amperage on lower wattage PSUs.
Posted on Reply
#27
bubbleawsome
Caring1Actually you probably didn't understand what I meant by my comment.
Higher capacity PSUs (Wattage) also are able to deliver more Amperage over the 12V rail, which is needed for the new GPUs.
It's clearly easier and cheaper for the manufacturers to upsell to a higher wattage PSU than increase available Amperage on lower wattage PSUs.
But at a set voltage the only way to get more wattage is with more amps no? And unless Corsair is trying a 1x8-pin to 12-pin connector for some awful reason, a 'standard' 2x8-pin to 12-pin shouldn't be drawing more amps across the rail than the 8-pins themselves would normally?
Posted on Reply
#28
jonnyGURU
Caring1Actually you probably didn't understand what I meant by my comment.
Higher capacity PSUs (Wattage) also are able to deliver more Amperage over the 12V rail, which is needed for the new GPUs.
It's clearly easier and cheaper for the manufacturers to upsell to a higher wattage PSU than increase available Amperage on lower wattage PSUs.
I'm trying to understand what you are implying in the context of the current discussion.

Yes. There are some PSUs that deliver much less of their total capability on the +12V rail. Especially those units that use group regulation or older PSUs that put more emphasis on the +5V because they pre-date or weren't meant for PCs with discrete graphics, but that's not what we're talking about in this thread.
Caring1Upselling PSUs based on capacity, instead of fixing what they already have to deliver the correct power draw over the PCI-e cable.
Good job.
Corsair suggested at least an 850W PSU. A Corsair 750W PSU (say, an RM750, for example) has 62.5A available on the +12V rail. That's 750W dead on.

So you're trying to make a point. What is it? That a Corsair 750W PSU can't actually deliver 750W on the +12V rail and that is why they suggest an 850W PSU? And if so, how do you come to that conclusion? Just based on the fact that they're suggesting 850W leads you to believe that their 750W is "fake"?

And as bubbleawesome is suggesting.... 648W (the 12-pin's maximum capability) is < 750W. And that load is split across two connectors at the PSU. Let's say it's an even 324W each. So you're saying that they're suggesting an 850W PSU because the 750W version can't output 324W from each modular connector?
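For anyone who wants to check that arithmetic, here is a small Python sketch; the 9 A per pin and 62.5 A figures are the ones quoted in this thread, not taken from a datasheet.

```python
# Sanity check of the figures in this thread: 9 A per Micro-Fit pin and
# 62.5 A on +12V for an RM750 are taken from the posts above, not a datasheet.

PIN_CURRENT_A = 9      # claimed rating per pin
RAIL_V = 12
POWER_PINS = 6         # six +12V pins (plus six grounds) in the 12-pin

twelve_pin_max_w = PIN_CURRENT_A * RAIL_V * POWER_PINS
print(f"12-pin theoretical maximum: {twelve_pin_max_w} W")                   # 648 W

per_psu_socket_w = twelve_pin_max_w / 2   # load split across two PSU-side sockets
print(f"Worst-case load per PSU-side connector: {per_psu_socket_w:.0f} W")   # 324 W

rm750_12v_w = 62.5 * RAIL_V               # advertised +12V capability of an RM750
print(f"RM750 +12V rail capability: {rm750_12v_w:.0f} W")                    # 750 W
```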
Posted on Reply
#29
R-T-B
jonnyGURUAnd as bubbleawesome is suggesting.... 648W (the 12-pin's maximum capability)
Woah. I thought it was less than that... like in the 350W range. Need to update myself again, I guess.
Posted on Reply
#30
jonnyGURU
R-T-BWoah. I thought it was less than that... like in the 350W range. Need to update myself again, I guess.
9A * 12V * 6 pins.
Posted on Reply
#31
Caring1
jonnyGURUI'm trying to understand what you are implying in the context of the current discussion.

Yes. There are some PSUs that deliver much less of their total capability on the +12V rail. Especially those units that use group regulation or older PSUs that put more emphasis on the +5V because they pre-date or weren't meant for PCs with discrete graphics, but that's not what we're talking about in this thread.
You really are confused, aren't you?
I thought what I stated was quite clear.
I clearly stated 12V rail power delivery, never implying or mentioning 5V.
You also keep talking about Watts, when I specifically mention Amps.
Posted on Reply
#32
jonnyGURU
Caring1You really are confused, aren't you?
I thought what I stated was quite clear.
I clearly stated 12V rail power delivery, never implying or mentioning 5V.
You also keep talking about Watts, when I specifically mention Amps.
I don't seem to be the one confused here.

And I am talking about watts because we're talking about a +12V rail. A * V = W.

I mentioned the +5V in that example because older PSUs will have more power on the +5V and less on the +12V and are therefore unsuitable for a computer that requires more on the +12V... like one with a discrete graphics card.

You were saying there wasn't enough AMPERAGE when the PSU is rated a certain WATTAGE. Your words. But if that AMPERAGE isn't on the +12V, then where is that WATTAGE????

You're not even reading what I'm typing and I don't think you even understand what you're even saying yourself. I'm pretty sure you're just trolling because nobody in this world can be this thick.
Posted on Reply
#33
kiriakost
Caring1Higher capacity PSUs (Wattage) also are able to deliver more Amperage over the 12V rail, which is needed for the new GPUs.
What is needed is what can actually be measured electrically with modern tools, and so far all we have are speculations.
My own speculation is that modern cards will not use the full potential of the 12-pin connector.

My second speculation is that people without a modular PSU will follow NVIDIA's suggestion and use the 2x 6+2-pin (8-pin) to 12-pin adapter cable.
In other words, there is no dead end; there are actually many options available.

Corsair's compatibility chart is of no value except to their own marketing.
Very soon the 12-pin connector will show up on eBay for $7 shipped with a set of female pins.
Then I might get the gadget and install it on my Corsair CX750.

Our real problem is graphics card pricing, not the power cables.
Posted on Reply
#34
bubbleawsome
R-T-BWoah. I thought it was less than that... like in the 350W range. Need to update myself again, I guess.
Also they aren't planning to run anywhere near that much through it from what I've seen. For now it's just a smaller way to supply 2x8-pin worth of power really, just that it technically *could* allow much higher power draw without a second 12-pin.
Posted on Reply
#35
kiriakost
bubbleawsomeAlso they aren't planning to run anywhere near that much through it from what I've seen. For now it's just a smaller way to supply 2x8-pin worth of power really, just that it technically *could* allow much higher power draw without a second 12-pin.
It is not in anyone's best interest to abandon the older 2x8-pin electrical specifications.
Such a move would exclude 99.9% of the PC systems on the market from the list of potential buyers of a new graphics card.
Posted on Reply
#36
bubbleawsome
kiriakostIt is not in anyone's best interest to abandon the older 2x8-pin electrical specifications.
Such a move would exclude 99.9% of the PC systems on the market from the list of potential buyers of a new graphics card.
Well in that case they could have made the adapters branch into 3x8-pin or even 4x8-pin. I just meant that even though the 12-pin supports ~650w doesn't mean that GPUs that use the connector need that much power. Many people are used to cards only using as many power connectors as they need, where a 3x8-pin card will always use more power than a card using 6-pin+8-pin, and that's just not true here.
Posted on Reply
#37
kiriakost
bubbleawsomeWell in that case they could have made the adapters branch into 3x8-pin or even 4x8-pin.
That is logical thinking, but PSU makers are starting to steer away from 4x8-pin because SLI is close to end of life.
bubbleawsomeI just meant that even though the 12-pin supports ~650w doesn't mean that GPUs that use the connector need that much power. Many people are used to cards only using as many power connectors as they need, where a 3x8-pin card will always use more power than a card using 6-pin+8-pin, and that's just not true here.
I totally agree with this theory. I bet that as soon as real benchmarks hit the road, we will have more clues and facts. :)
Posted on Reply
#38
Valantar
kiriakostMy own speculation is that modern cards will not use the full potential of the 12-pin connector.
You don't really need to speculate about that; it's clear that they won't given that the most power hungry GPU with this connector is rated for 350W TBP. If history is anything to go by, Nvidia's GPUs just get more tightly locked down in terms of power draw for every new generation, and it would be very surprising if Ampere were to break this trend. So that's not just speculation, it's bordering on self-evident fact from the two facts that a) the connector is able to deliver ~650W and b) the most power hungry GPU with said connector is rated for 350W. Combining the two hardly qualifies as speculation.

The more interesting part is why Nvidia made such an overpowered connector in the first place, for which the answer is probably something like a combination of space savings, headroom (OC, stability, etc.), and general peace of mind. Shunt modded 2080 Tis can hit 400W, and that's a 275W GPU, so it stands to reason that extreme overclocking of a 3090 would push it far beyond the rated capabilities of 2x8-pin PCIe, requiring a third, which would take up a lot of board space and require some fugly cabling. The 12-pin provides even more power than this in a much smaller footprint with more manageable cabling. I'd call that a win-win scenario.
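A rough sketch of that headroom argument in Python, using only the figures quoted in this post (the ~400 W shunt-modded 2080 Ti and the ~650 W connector rating) plus the generic 150 W / 75 W PCIe ratings:

```python
# Headroom comparison using only figures quoted in this post and thread:
# ~400 W for a shunt-modded 2080 Ti, ~648 W for the 12-pin, and the generic
# PCIe ratings of 150 W per 8-pin and 75 W from the slot.

PCIE_8PIN_W = 150
PCIE_SLOT_W = 75
TWELVE_PIN_W = 648            # 9 A * 12 V * 6 pins, per the discussion above
SHUNT_MODDED_2080TI_W = 400   # extreme-OC figure mentioned in this post

configs = {
    "2x 8-pin + slot": 2 * PCIE_8PIN_W + PCIE_SLOT_W,   # 375 W
    "3x 8-pin + slot": 3 * PCIE_8PIN_W + PCIE_SLOT_W,   # 525 W
    "12-pin + slot":   TWELVE_PIN_W + PCIE_SLOT_W,      # 723 W
}
for label, cap_w in configs.items():
    verdict = "covers" if cap_w >= SHUNT_MODDED_2080TI_W else "does NOT cover"
    print(f"{label}: {cap_w} W, {verdict} a ~{SHUNT_MODDED_2080TI_W} W extreme-OC load")
```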
Posted on Reply
#39
vstherock
so I guess my AX1200 is not compatible
Posted on Reply
#40
kiriakost
vstherockso I guess my AX1200 is not compatible
You may consider it compatible for as long as the warranty is active.
Posted on Reply
#41
Ubersonic
ValantarThe more interesting part is why Nvidia made such an overpowered connector in the first place
Nvidia's power claims for the connector need to be taken with an enormous pinch of salt. It must be remembered that this is not an official connector that applies to any PC standards. Therefore Nvidia can make up whatever numbers they like.

To put this in perspective, the 6pin PCI-E connector is rated at 75w by PCI-SIG; if you took two of them and glued them together to make a 12pin it would be rated at 150w, and if you then shrunk it to 70% of its original size it would be rated at 105w. That is the exact process Nvidia used to "invent" this new 12pin, and the only reason they claim 600w+ is because it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
Posted on Reply
#42
kiriakost
UbersonicNvidia's power claims for the connector need to be taken with an enormous pinch of salt. It must be remembered that this is not an official connector that applies to any PC standards. Therefore Nvidia can make up whatever numbers they like.

To put this in perspective, the 6pin PCI-E connector is rated at 75w by PCI-SIG; if you took two of them and glued them together to make a 12pin it would be rated at 150w, and if you then shrunk it to 70% of its original size it would be rated at 105w. That is the exact process Nvidia used to "invent" this new 12pin, and the only reason they claim 600w+ is because it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
There is a new rumor circulating that only the Founders Edition cards will use the 12-pin connector.
If this is true, then we were all misled into spending too much brain-cell energy for no reason.
Posted on Reply
#43
bubbleawsome
UbersonicNvidia's power claims for the connector need to be taken with an enormous pinch of salt. It must be remembered that this is not an official connector that applies to any PC standards. Therefore Nvidia can make up whatever numbers they like.

To put this in perspective, the 6pin PCI-E connector is rated at 75w by PCI-SIG; if you took two of them and glued them together to make a 12pin it would be rated at 150w, and if you then shrunk it to 70% of its original size it would be rated at 105w. That is the exact process Nvidia used to "invent" this new 12pin, and the only reason they claim 600w+ is because it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
1) It's well known PCI-e connectors can safely supply more than their rated wattage by a fair amount
2) Pins don't equal power. A PCI-SIG 8-pin can do 150w, as much as your invented 12-pin.
3) That isn't the "exact process Nvidia used to 'invent' this new 12pin", that would leave 2 pins unassigned, and make terrible use of everything else. You don't need to double the ground and other wires when doubling to 12 pin, so instead of 4 specced +12v pins in a doubled 6-pin you could easily run 6 or 7.
4) Shrinking the connector means nothing in terms of power delivery, especially when the wire gauge is increased like it's supposed to be for the 12-pin.

EDIT: In fact by just googling it, it does run 6 +12v, so as many +12v pins as two 8-pins combined. Even without the gauge increase and other bits that's at least 300w right there. But remember, the 75w increase on the 8-pin over the 6-pin comes from only 1 extra +12v pin.
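To put that per-pin arithmetic in one place, here is a quick Python sketch; the +12V pin counts are the ones assumed in this thread, so treat it as illustration rather than spec.

```python
# Per-pin arithmetic behind the point above. The +12V pin counts are the ones
# assumed in this thread (real PCI-SIG pinouts differ in some details), so
# treat this as illustration rather than spec.

connectors = {
    # name: (rated/claimed wattage, +12V pins assumed here)
    "PCIe 6-pin":    (75, 2),
    "PCIe 8-pin":    (150, 3),
    "NVIDIA 12-pin": (648, 6),
}

for name, (watts, pins) in connectors.items():
    per_pin_w = watts / pins
    per_pin_a = per_pin_w / 12   # amps per +12V pin at 12 V
    print(f"{name}: {watts} W over {pins} +12V pins "
          f"-> {per_pin_w:.1f} W ({per_pin_a:.1f} A) per pin")
```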
Posted on Reply
#44
Valantar
UbersonicNvidia's power claims for the connector need to be taken with an enormous pinch of salt. It must be remembered that this is not an official connector that applies to any PC standards. Therefore Nvidia can make up whatever numbers they like.

To put this in perspective, the 6pin PCI-E connector is rated at 75w by PCI-SIG; if you took two of them and glued them together to make a 12pin it would be rated at 150w, and if you then shrunk it to 70% of its original size it would be rated at 105w. That is the exact process Nvidia used to "invent" this new 12pin, and the only reason they claim 600w+ is because it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
... that's not how this works. The pins and housing used (Micro-Fit rather than the Mini-Fit Jr. used for PCIe plugs) are entirely standardized. Just because they haven't been adopted for a PC power delivery standard before this doesn't mean that Nvidia pulled these plugs out of their collective rear ends. In this case, the Micro-fit pins are rated for 9A (though versions exist for lower amperages as well as a 10.5A version). That's where the number comes from: 9A * 12V * 6 pins = 648W total power output for a connector with 6 +12V and 6 ground pins. This of course requires a suitable power source as well as wiring thick enough to handle that amount of current per pin, which means some really thick cabling even over a relatively short ATX PSU cable length. Most PSU cabling is 18 AWG, which would heat up massively and cause significant voltage drops if asked to transmit 9A (according to this calculator, .23V drop over 60cm or .38V over 1m, with the entirety of that voltage drop being converted to heat in the cable, i.e. .23 * 12 * 9 = 25W or .38 * 12 * 9 = 41W of heat in your cable alone - good luck cooling that). That would make for some very melty connectors, so you'd need thicker gauge cables to rectify this. 14 AWG wiring would mean just .15V drop over 1m of wiring, which is a lot more tolerable, but that's also a cable with more than 2x the copper, i.e. a very thicc boi. Now imagine 12 of those .... yeah, that's not a cable you want to manage inside of a cramped case.
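A first-principles sketch of the same wire-gauge math in Python, computed from copper resistivity; the exact figures depend on the run length and on whether both the supply and return conductors are counted, so they won't match any particular online calculator exactly.

```python
# First-principles version of the wire-gauge argument above: voltage drop and
# heat in the cable at 9 A per pin, computed from copper resistivity. Exact
# numbers depend on run length and on counting both supply and return wires,
# so they won't match any particular online calculator precisely.
import math

RHO_CU = 1.68e-8        # ohm*m, copper resistivity at ~20 C

def awg_resistance_per_m(awg: int) -> float:
    """Resistance per metre of a solid copper conductor of the given AWG."""
    diameter_m = 0.000127 * 92 ** ((36 - awg) / 39)   # standard AWG formula
    area_m2 = math.pi * (diameter_m / 2) ** 2
    return RHO_CU / area_m2

CURRENT_A = 9.0          # per +12V pin, the Micro-Fit rating discussed above
RUN_M = 0.6              # one-way cable length from PSU to GPU
PAIRS = 6                # six +12V/ground pairs in the 12-pin cable

for awg in (18, 16, 14):
    r_round_trip = awg_resistance_per_m(awg) * RUN_M * 2     # out and back
    v_drop = CURRENT_A * r_round_trip
    heat_w = PAIRS * CURRENT_A ** 2 * r_round_trip            # whole cable
    print(f"{awg} AWG, {RUN_M} m run: {v_drop:.2f} V drop per pair, "
          f"~{heat_w:.0f} W dissipated in the cable at {CURRENT_A:.0f} A/pin")
```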

So yes, Nvidia absolutely has to conform to standards, and your reasoning is completely off. The reason AIB partners aren't using the connector is likely a combination of a) cost (a new connector = higher price, new supply chain, new tooling to implement, new PCB design guidelines etc.), b) experience/convenience (more than a decade of using PCIe connectors making them simple and secure to implement), and c) necessity. When you can copy+paste in a 3x8-pin power entry design from a few years back for your upcoming power hog flagship, why take on the expense of using a new connector that will just eat into your margins while forcing you to also bundle in a (again, expensive) cable adapter as nobody has a PSU with this connector on it? It isn't necessary unless you're building a tiny PCB or need more than the cumulative 525W that 3x150W 8-pins + 75W from the slot can deliver. And the 3090 isn't that power hungry.
Posted on Reply
#45
kiriakost
Valantarthe Micro-fit pins are rated for 9A
That is the upper limit before the room fills with smoke.
A sensible electrical engineer will take this figure and design his application to draw, as continuous power, 40% less than that.
No one drives a car at the end of the speedometer.
Posted on Reply
#46
Valantar
kiriakostThat is the upper limit before the room fills with smoke.
A sensible electrical engineer will take this figure and design his application to draw, as continuous power, 40% less than that.
No one drives a car at the end of the speedometer.
I know. But that's what they're rated for. Ratings also typically include some headroom (it would be rather problematic if they melted at 9.1A ...), but staying well below the rating is obviously the safer choice (especially considering that power draw fluctuates in nearly any application, so aiming for the maximum spec is deeply problematic). That doesn't change the fact that @Ubersonic's argument was flat out wrong though.
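As a quick illustration of that derating logic (the 40% figure is the rule of thumb suggested above, not anything from a connector spec):

```python
# Putting the derating argument in numbers. The 40% continuous-use derating is
# the rule of thumb suggested above, not a figure from any connector spec.

RATED_W = 9 * 12 * 6                    # 648 W per the pin-rating discussion
DERATE = 0.40                           # run continuously 40% below the limit
RTX_3090_W = 350                        # rated board power from the article

continuous_w = RATED_W * (1 - DERATE)
print(f"Derated continuous capability: {continuous_w:.0f} W")       # ~389 W
print(f"RTX 3090 rated board power:    {RTX_3090_W} W")
print(f"Margin left after derating:    {continuous_w - RTX_3090_W:.0f} W")
```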
Posted on Reply
#47
kiriakost
ValantarI know. But that's what they're rated for. Ratings also typically include some headroom (it would be rather problematic if they melted at 9.1A ...), but staying well below the rating is obviously the safer choice (especially considering that power draw fluctuates in nearly any application, so aiming for the maximum spec is deeply problematic). That doesn't change the fact that @Ubersonic's argument was flat out wrong though.
Typically the headroom for the pin alone is no more than 25%, and only for a limited time.
The GPU and the card as a whole do not have constant power requirements, even in gaming; the power the card draws will fluctuate constantly.
The weak link in this math is the thermal issue that could melt the connector housing, which would happen as soon as a pair of pins (male and female) started arcing to each other.

Unfortunately, even in the best PSUs the engineers have not included (and it is almost impossible to include) active protection against high DC current spikes.
In such an event the PSU's 12V rail will blow first, and the pins will become unusable due to the carbon deposits on them.
That is a walk-through-hell situation, and therefore all wattage-sizing math should be done against the nominal standards (specifications).

NVIDIA is perfectly capable of performing electrical tests, and they did so during QC testing in the R&D period.
Electrical measurement with qualified tools is the language of truth, because everything can be re-measured and confirmed.
If NVIDIA's marketing people wish to build new myths, electrical measurements are what will expose them as fake news.
I am in love with the field of electrical test and measurement, because it lets me kill any doubt within seconds.
Posted on Reply
#48
Valantar
kiriakostTypically the headroom for the pin alone is no more than 25%, and only for a limited time.
The GPU and the card as a whole do not have constant power requirements, even in gaming; the power the card draws will fluctuate constantly.
The weak link in this math is the thermal issue that could melt the connector housing, which would happen as soon as a pair of pins (male and female) started arcing to each other.

Unfortunately, even in the best PSUs the engineers have not included (and it is almost impossible to include) active protection against high DC current spikes.
In such an event the PSU's 12V rail will blow first, and the pins will become unusable due to the carbon deposits on them.
That is a walk-through-hell situation, and therefore all wattage-sizing math should be done against the nominal standards (specifications).

NVIDIA is perfectly capable of performing electrical tests, and they did so during QC testing in the R&D period.
Electrical measurement with qualified tools is the language of truth, because everything can be re-measured and confirmed.
If NVIDIA's marketing people wish to build new myths, electrical measurements are what will expose them as fake news.
I am in love with the field of electrical test and measurement, because it lets me kill any doubt within seconds.
Again, entirely agree. But GPU reviews from reputable sites using advanced testing equipment (Tom's hardware does excellent GPU power testing, and has for years) have shown clearly that GPUs don't spike - even momentarily - all that far beyond their rated or measured total power draw. Sure, there are spikes, but at most ~50W for a high end GPU, and for a few ms at most. This would of course change with modding like shunt mods or extreme overclockers using their own power delivery circuitry entirely, but the latter would also bypass the 12-pin connector. So given a rated 9A per pin for the connector, using it for something like a (theoretical) 500W average/rated power draw GPU would be entirely safe as long as the wiring from the PSU is of a sufficient gauge. The chances of that GPU spiking even past 600W for a few ms would be minuscule, let alone past 650W for any period of time.
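A trivial margin check for the scenario described here, using the hypothetical 500 W card and the ~50 W spike figure from this post:

```python
# Quick margin check for the scenario described above: a hypothetical 500 W
# rated GPU with short ~50 W transient spikes, against the 648 W connector
# rating (all figures are the ones used in this post, not measurements).

CONNECTOR_MAX_W = 648
gpu_rated_w = 500          # hypothetical card from the post above
spike_w = 50               # worst millisecond-scale excursion assumed above

peak_w = gpu_rated_w + spike_w
print(f"Worst-case momentary draw: {peak_w} W")
print(f"Connector rating:          {CONNECTOR_MAX_W} W")
print(f"Remaining margin at peak:  {CONNECTOR_MAX_W - peak_w} W")
```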
Posted on Reply
#49
kiriakost
ValantarAgain, entirely agree. But GPU reviews from reputable sites using advanced testing equipment (Tom's hardware does excellent GPU power testing, and has for years) have shown clearly that GPUs don't spike - even momentarily - all that far beyond their rated or measured total power draw.
You can hardly record specific spikes when you are measuring the energy on the AC side.
This is a mistake many people make, and they should stop doing it.
Naturally such a change requires a significant investment in equipment, but there is no other way if they want to deliver realistic product comparisons.
Posted on Reply
#50
Valantar
kiriakostYou can hardly record specific spikes when you are measuring the energy on the AC side.
This is a mistake many people make, and they should stop doing it.
Naturally such a change requires a significant investment in equipment, but there is no other way if they want to deliver realistic product comparisons.
I'm obviously not talking about people measuring AC power draw. I'm talking about people measuring GPU-only DC draw from both the PCIe slot and power connectors. Sure, lots of people still do what you say, but that's a mode of measurement that's barely any better than software readings.
Posted on Reply