Tuesday, September 8th 2020

Corsair Working On Direct 12-Pin NVIDIA Ampere Power Cable
NVIDIA introduced a new 12-pin power connector with its RTX 30-series Founders Edition cards to accommodate their higher power draw. The new RTX 30-series cards feature GPU power requirements of 220 W, 320 W, and 350 W for the RTX 3070, 3080, and 3090, respectively. The new 12-pin connector is roughly the same size as a single 8-pin PCIe connector but can provide significantly more power in the same space. NVIDIA will supply an adapter in the box to convert two 8-pin connectors to a single 12-pin connector; however, this will require extra cable management and introduce another point of failure.
Corsair has announced they are developing a custom cable that will be fully compatible with all Type 3 and Type 4 CORSAIR modular power supplies to allow for a clean connection. The new cable connects two PCIe/CPU PSU ports directly to the new 12-pin connector and is currently undergoing development and testing. Corsair also now recommends a PSU rated at 850 watts or higher for the RTX 3090. The Corsair 12-pin cable should be available for sale by September 17th, the same day as the RTX 30-series cards. Pricing wasn't announced, but you can sign up to be notified here.
Source:
Corsair
51 Comments on Corsair Working On Direct 12-Pin NVIDIA Ampere Power Cable
Higher-capacity (higher-wattage) PSUs are also able to deliver more amperage over the 12V rail, which is what the new GPUs need.
It's clearly easier and cheaper for manufacturers to upsell to a higher-wattage PSU than to increase the available amperage on lower-wattage PSUs.
Yes. There are some PSUs that deliver much less of their total capacity on the +12V rail, especially units that use group regulation, or older PSUs that put more emphasis on the +5V because they pre-date or weren't meant for PCs with discrete graphics. But that's not what we're talking about in this thread. Corsair suggested at least an 850W PSU. A Corsair 750W PSU (say, an RM750, for example) has 62.5A available on the +12V rail. That's 750W dead on.
So you're trying to make a point. What is it? That a Corsair 750W PSU can't actually deliver 750W on the +12V rail and that is why they suggest an 850W PSU? And if so, how do you come to that conclusion? Just based on the fact that they're suggesting 850W leads you to believe that their 750W is "fake"?
And as bubbleawesome is suggesting.... 648W (the 12-pin's maximum capability) is < 750W. And that load is split across two connectors at the PSU. Let's say it's an even 324W each. So you're saying that they're suggesting an 850W PSU because the 750W version can't output 324W from each modular connector?
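To make the arithmetic in this exchange concrete, here's a quick sketch (plain Python; the 62.5 A rail rating and the 648 W maximum split across two PSU-side connectors are the figures quoted above, not independently verified):

```python
# Sanity-check the figures quoted above (illustrative only).

RAIL_VOLTAGE = 12.0  # volts on the +12V rail

# A Corsair RM750's +12V rail is quoted at 62.5 A above.
rail_amps = 62.5
rail_watts = rail_amps * RAIL_VOLTAGE
print(f"+12V rail: {rail_amps} A x {RAIL_VOLTAGE:.0f} V = {rail_watts:.0f} W")  # 750 W

# The 12-pin's quoted maximum, split evenly across the two
# PSU-side connectors the Corsair cable plugs into.
connector_max_watts = 648.0
per_connector = connector_max_watts / 2
print(f"Load per PSU-side connector: {per_connector:.0f} W")  # 324 W
```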
I thought what I stated was quite clear.
I clearly stated 12V rail power delivery, never implying or mentioning 5V.
You also keep talking about Watts, when I specifically mention Amps.
And I am talking about watts because we're talking about a +12V rail. A * V = W.
I mentioned the +5V in that example because older PSUs will have more power on the +5V and less on the +12V, and are therefore unsuitable for a computer that requires more on the +12V.... like one with a discrete graphics card.
You were saying there wasn't enough AMPERAGE when the PSU is rated a certain WATTAGE. Your words. But if that AMPERAGE isn't on the +12V, then where is that WATTAGE????
You're not even reading what I'm typing and I don't think you even understand what you're even saying yourself. I'm pretty sure you're just trolling because nobody in this world can be this thick.
My own speculation is that current cards will not use the full potential of the 12-pin connector.
My second speculation is that people without a modular PSU will follow NVIDIA's suggestion (the 2x 6+2-pin (8-pin) to 12-pin adapter cable).
In other words, there is no dead end; there are actually many options available.
Corsair's compatibility chart is of no value, or serves only their own marketing.
Very soon the 12-pin connector will show up on eBay at $7 shipped with a set of female pins.
Then I might get the gadget and fit it to my Corsair CX750.
Our real problem is graphics card pricing, not the power cables.
Because such pricing excludes 99.9% of the PC systems available on the market from the list of potential buyers of a new graphics card.
The more interesting part is why Nvidia made such an overpowered connector in the first place, for which the answer is probably something like a combination of space savings, headroom (OC, stability, etc.), and general peace of mind. Shunt modded 2080 Tis can hit 400W, and that's a 275W GPU, so it stands to reason that extreme overclocking of a 3090 would push it far beyond the rated capabilities of 2x8-pin PCIe, requiring a third, which would take up a lot of board space and require some fugly cabling. The 12-pin provides even more power than this in a much smaller footprint with more manageable cabling. I'd call that a win-win scenario.
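As a rough check on that reasoning, here's a back-of-the-envelope comparison (the 150 W and 75 W figures are the PCI-SIG ratings; the 400 W shunt-mod draw is the figure quoted above):

```python
# Back-of-the-envelope connector budgets (illustrative only).

PCIE_8PIN_W = 150  # PCI-SIG rating per 8-pin PCIe connector
PCIE_SLOT_W = 75   # power deliverable through the PCIe slot

two_8pin = 2 * PCIE_8PIN_W + PCIE_SLOT_W    # 375 W total budget
three_8pin = 3 * PCIE_8PIN_W + PCIE_SLOT_W  # 525 W total budget

shunt_modded_2080ti = 400  # observed draw quoted above

print(f"2x 8-pin + slot: {two_8pin} W")
print(f"3x 8-pin + slot: {three_8pin} W")
print(f"400 W draw {'exceeds' if shunt_modded_2080ti > two_8pin else 'fits'} "
      f"the 2x 8-pin budget")
```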
To put this in perspective, the 6-pin PCI-E connector is rated at 75W by PCI-SIG. If you took two of them and glued them together to make a 12-pin, it would be rated at 150W, and if you then shrunk it to 70% of its original size it would be rated at 105W. That is the exact process Nvidia used to "invent" this new 12-pin; the only reason they claim 600W+ is because it doesn't have to conform to any standards, which is probably one of the reasons none of the AIBs are using it.
If this is true, then we were all misled into spending too much brain-cell energy for no reason.
2) Pins don't equal power. A PCI-SIG 8-pin can do 150w, as much as your invented 12-pin.
3) That isn't the "exact process Nvidia used to 'invent' this new 12pin", that would leave 2 pins unassigned, and make terrible use of everything else. You don't need to double the ground and other wires when doubling to 12 pin, so instead of 4 specced +12v pins in a doubled 6-pin you could easily run 6 or 7.
4) Shrinking the connector means nothing in terms of power delivery, especially when the wire gauge is increased like it's supposed to be for the 12-pin.
EDIT: In fact by just googling it, it does run 6 +12v, so as many +12v pins as two 8-pins combined. Even without the gauge increase and other bits that's at least 300w right there. But remember, the 75w increase on the 8-pin over the 6-pin comes from only 1 extra +12v pin.
So yes, Nvidia absolutely has to conform to standards, and your reasoning is completely off. The reason AIB partners aren't using the connector is likely a combination of a) cost (a new connector = higher price, new supply chain, new tooling to implement, new PCB design guidelines, etc.), b) experience/convenience (more than a decade of using PCIe connectors makes them simple and secure to implement), and c) necessity. When you can copy+paste a 3x 8-pin power entry design from a few years back for your upcoming power-hog flagship, why take on the expense of using a new connector that will just eat into your margins while forcing you to also bundle in an (again, expensive) cable adapter, as nobody has a PSU with this connector on it? It isn't necessary unless you're building a tiny PCB or need more than the cumulative 525W that 3x 150W 8-pins + 75W from the slot can deliver. And the 3090 isn't that power hungry.
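Working backwards from the figures in this thread (a 648 W maximum over six +12V pins, as quoted earlier), the implied per-pin current is easy to check; a minimal sketch:

```python
# Implied per-pin current for the 12-pin, derived from the 648 W
# maximum and the six +12V pins quoted earlier in the thread.

RAIL_VOLTAGE = 12.0
CONNECTOR_MAX_W = 648.0
POWER_PINS = 6  # +12V pins, matched by six grounds

amps_per_pin = CONNECTOR_MAX_W / POWER_PINS / RAIL_VOLTAGE
print(f"Implied current per +12V pin: {amps_per_pin:.1f} A")  # 9.0 A
```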
A normally thinking electrical engineer will take this information and design his application to draw, as continuous power, about 40% less than that.
No one drives a car at the end of the speedometer.
The GPU, and the card as a whole, does not have constant power requirements even in gaming; the electric power the card requests (needs) will fluctuate constantly.
The weak link in this math is the thermal issue, which could melt the connector housing, and this would happen as soon as a pair of pins (male and female) starts arcing to each other.
Unfortunately, even in the best PSUs the engineers have not managed (and it is almost impossible) to include active protection against high DC current spikes.
In such an event the PSU's 12V rail will blow first, and the pins will become unusable due to the carbon deposits on them.
That is a walk-through-hell situation, and therefore all the math about wattage sizing should be done according to the nominal standards (specifications).
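To illustrate the 40% derating this comment proposes (the 40% margin is the commenter's rule of thumb, not a published spec; 648 W is the connector maximum quoted earlier):

```python
# The 40% continuous-power derating proposed above, applied to the
# 648 W connector maximum. The margin is a rule of thumb, not a spec.

CONNECTOR_MAX_W = 648.0
DERATING = 0.40

continuous_budget = CONNECTOR_MAX_W * (1 - DERATING)
print(f"Suggested continuous budget: {continuous_budget:.0f} W")  # ~389 W
```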
NVIDIA is very capable of performing electrical tests, and they did so during QC testing in the R&D period.
Electrical measurement with qualified tools is the language of truth, because everything can be re-measured and confirmed.
If NVIDIA's marketing people wish to build new myths, electrical measurements are what will tear anything down as fake news.
I am in love with the field of electrical test and measurement, because it makes me capable of killing any doubt within seconds.
This is a mistake that many make, and they should stop making it.
Naturally, such a change will require a significant cash investment in equipment, but there is no other way if they intend to deliver realistic product comparisons.