Sunday, August 23rd 2020

Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (and not a PCI-SIG or ATX standard), and that its power output limit could be as high as 600 W. Seasonic's adapter converts two 12 V 8-pin PSU-side connectors into a single 12-pin connector, which lends weight to that power-output figure. On typical Seasonic modular PSUs, the included cables convert one PSU-side 8-pin 12 V connector into two 6+2 pin PCIe power connectors along a single cable. HardwareLuxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.
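As a rough aside, here is a back-of-the-envelope sketch of where a ~600 W ceiling could come from, assuming the 12-pin connector uses six +12 V supply pins with Micro-Fit-class terminals rated around 8.5 A each; the per-pin rating is an assumption for illustration, not something the article confirms.

# Back-of-the-envelope estimate of the 12-pin connector's ceiling.
# Assumption: 6 of the 12 pins carry +12 V, the rest are grounds, and each
# terminal is rated around 8.5 A (Micro-Fit 3.0-class); not confirmed by the article.
VOLTAGE_V = 12.0
CURRENT_PER_PIN_A = 8.5
SUPPLY_PINS = 6

ceiling_w = VOLTAGE_V * CURRENT_PER_PIN_A * SUPPLY_PINS
print(f"Estimated connector ceiling: {ceiling_w:.0f} W")  # ~612 W, in line with the ~600 W figure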

kopite7kimi, an NVIDIA leaker with an extremely high hit rate, however predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference-design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB featuring three 8-pin PCIe power connectors, which seems to be additional proof that a single 12-pin connector is a really fat straw for 12 V juice. The label on the Seasonic cable's box recommends using it with PSUs rated for at least 850 W (which could very well be a system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
Source: VideoCardz

119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

#26
jabbadap
Those are not really adapters, though. They are modular 12-pin cables for modular SeaSonic PSUs.

But yeah, the 12-pin connector for RTX 30 FEs seems to be certain, and those will most probably come with a 2x 8-pin PCIe to 12-pin adapter.
Posted on Reply
#27
Ruru
S.T.A.R.S.
RedelZaVedno: Recommended to use 850W power supply... Is 3090 an induction heater? F this... no way am I buying GPU with such power demands.
They have always recommended an overkill PSU just because of those el cheapo PSUs of questionable quality. A good quality 550W will probably be enough, just guessing.
Posted on Reply
#28
JustAnEngineer
It's probably a good idea to target something under 80% of maximum capacity for a new PSU, since that's where they are most efficient.

Three 8-pin PCIe connectors at 150 watts each plus up to 75 watts from the motherboard through the PCIe x16 slot seems like a tremendous amount of power for a graphics card. Even if it's not using the maximum that that configuration can provide, consider that two 8-pin connectors plus the PCIe slot would have been good for up to 375 watts, and NVidia's engineers determined that it needed another 8-pin power connector. :eek:
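For reference, a minimal sketch of the arithmetic above. The 150 W per 8-pin and 75 W per slot figures are the PCIe spec limits the comment cites; the 250 W rest-of-system draw and the 80% loading target are illustrative assumptions only.

# Connector-limit arithmetic from the comment above (PCIe spec limits).
EIGHT_PIN_W = 150   # per 8-pin PCIe power connector
SLOT_W = 75         # from the x16 slot

two_connector_card = 2 * EIGHT_PIN_W + SLOT_W    # 375 W ceiling
three_connector_card = 3 * EIGHT_PIN_W + SLOT_W  # 525 W ceiling

# Hypothetical rest-of-system draw and the 80% loading rule of thumb, purely
# for illustration; real cards rarely pull the full connector ceiling.
REST_OF_SYSTEM_W = 250
for label, gpu_w in [("2x 8-pin", two_connector_card), ("3x 8-pin", three_connector_card)]:
    system_w = gpu_w + REST_OF_SYSTEM_W
    psu_w = system_w / 0.8
    print(f"{label}: up to {gpu_w} W GPU, ~{system_w} W system, ~{psu_w:.0f} W PSU to stay under 80% load")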
Posted on Reply
#30
Ruru
S.T.A.R.S.
JustAnEngineer: It's probably a good idea to target something under 80% of maximum capacity for a new PSU, since that's where they are most efficient.

Three 8-pin PCIe connectors at 150 watts each plus up to 75 watts from the motherboard through the PCIe x16 slot seems like a tremendous amount of power for a graphics card. Even if it's not using the maximum that that configuration can provide, consider that two 8-pin connectors plus the PCIe slot would have been good for up to 375 watts, and NVidia's engineers determined that it needed another 8-pin power connector. :eek:
Technically 6-pin and 8-pin are the same; the two extra pins are just a sense pin and a ground pin. And the sense pin is a ground too.
Posted on Reply
#31
jayseearr
JustAnEngineer: It's probably a good idea to target something under 80% of maximum capacity for a new PSU, since that's where they are most efficient.

Three 8-pin PCIe connectors at 150 watts each plus up to 75 watts from the motherboard through the PCIe x16 slot seems like a tremendous amount of power for a graphics card. Even if it's not using the maximum that that configuration can provide, consider that two 8-pin connectors plus the PCIe slot would have been good for up to 375 watts, and NVidia's engineers determined that it needed another 8-pin power connector. :eek:
Well ultimately they determined it needed another 4 pin, not another 8 pin right? Either way, i agree...seems like a yikes that they couldn't keep it under 375 watts, even for a flagship *edit never mind, i am a doofus. bad maths
Posted on Reply
#32
ThrashZone
Hi,
Funny outrage over 850w psu lol :roll:
Posted on Reply
#33
jayseearr
I don't think anybody here is outraged; for most people this likely won't even matter anyway. Just a bit of a head-scratcher how they couldn't keep it under 375 watts, I suppose... ultimately, if it requires more, I think a 12-pin is obviously a more elegant solution than adding more pins. Truth is, once again, anybody who actually buys this card has a 99.9% chance of not giving a sh*t anyway
Posted on Reply
#34
Ruru
S.T.A.R.S.
ThrashZone: Hi,
Funny outrage over 850w psu lol :roll:
'cos 850W is totally overkill for a modern non-HEDT system since SLI/CF is pretty dead. Even my 750W is hella overkill, but I did have CrossFire when I bought this; two R9 290 cards are hella power hungry.
Posted on Reply
#35
ThrashZone
Hi,
Chips are getting more and more cores, even non-HEDT chips like the 10900K, and a lot of mainstream AMD chips are 12 cores and up, so it's just a number that accounts for people using more than 4-6-8 cores, and they know their audience is overclocking everything nowadays; even turbo is getting up there :-)
Posted on Reply
#36
jayseearr
ThrashZone: Hi,
Chips are getting more and more cores, even non-HEDT chips like the 10900K, and a lot of mainstream AMD chips are 12 cores and up, so it's just a number that accounts for people using more than 4-6-8 cores, and they know their audience is overclocking everything nowadays; even turbo is getting up there :)
That is true; when it comes to recommendations like that, they are clearly not gospel, but at the same time they come up with that number for a reason. They can't give a power recommendation for every hardware configuration known to man, so naturally they go on the safe side and factor it with a high-end system and some headroom. This is the logical and responsible thing to do. People are free to do the math and calculate their system to exact wattage and buy accordingly if they want, but I am of the mindset that good power supplies come with a solid warranty, so why not buy one that's a little "overkill" in case you want to upgrade to more power-hungry components in the future. I don't need the 750 watts that I have in my system at all... not even close, really. But I'm still happy that I bought a 750 watt PSU =)
Posted on Reply
#37
ThrashZone
Hi,
Bingo
So a 750W PSU, when most GPUs to date recommended 600-650W, worked out just fine for most people.
But being triggered over this is funny lol :)

I mean, I have a 1200 P2 on X299 & a 1000 P2 on Z490; I also have an 850 P2 on X99
Posted on Reply
#39
ThrashZone
Hi,
Yeah, one could figure that once NVIDIA said 2x 8-pin was bulky and a pain to route for builders, this single 12-pin nonsense was just silly.

Hopefully real GPU makers will go on as they have always done, with 2x 8-pin, or 3x 8-pin on monsters like the king|pin, and forget this NVIDIA 12-pin nonsense plug entirely :rockout:
Posted on Reply
#40
bonehead123
Next upcoming headline:


RTX 3090xxx... THE world's first & only neutrino-based, graviton-accelerator-powered GPU, featuring our patented, atmosphere-depleting quadruple super-duper LH2^⁹⁹ cooling system. And with an MSRP of a mere $5.48752MM USD, everyone who uses any computer anywhere for any purpose will be telepathically compelled to buy one, or 2, or maybe even 12!

Yep, you heard it here 1st folks,....

And yes, you're welcome ..:laugh:..:D..o_O
Posted on Reply
#41
Ruru
S.T.A.R.S.
ThrashZone: Hi,
Chips are getting more and more cores, even non-HEDT chips like the 10900K, and a lot of mainstream AMD chips are 12 cores and up, so it's just a number that accounts for people using more than 4-6-8 cores, and they know their audience is overclocking everything nowadays; even turbo is getting up there :)
The 10900K (and 9900K) are totally different things when power limits are removed; those are true toasters, but a typical gamer doesn't even overclock these days.

My argument still stands: 850W is overkill for a single-GPU, non-HEDT system.
Posted on Reply
#42
ThrashZone
Chloe Price: The 10900K (and 9900K) are totally different things; those are true toasters, but a typical gamer doesn't even overclock these days.

My argument still stands: 850W is overkill for a single-GPU, non-HEDT system.
Hi,
NVIDIA is not clairvoyant; 850W is only a recommended spec, not written in stone.
It's a lowball for my 9940X at 4.8 GHz & higher, and likely a lowball for my 10900K at 5.0 GHz & higher. I'm not triggered over that; it's pretty much why I overkill stuff and don't go bare minimum on anything but maybe the case :-)
Posted on Reply
#43
PowerPC
ThrashZone: Hi,
Chips are getting more and more cores, even non-HEDT chips like the 10900K, and a lot of mainstream AMD chips are 12 cores and up, so it's just a number that accounts for people using more than 4-6-8 cores, and they know their audience is overclocking everything nowadays; even turbo is getting up there :)
Good note and something that I would actually like to know. The fact that it's so big, draws so much power, is called 3090 (above the previous max of xx80), and has the new architecture must surely mean that its performance is a beast. Do we even know anything about its performance yet? I feel like most of the posts are just critiquing this card without any actually relevant information.

Because if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, which would mean it'll be like 100% faster per Watt than 2080 ti, using way more Watts... It could literally demolish the 2080 ti. Why is nobody talking about this possibility? Even if it's just a rumor, it also kinda makes sense to me so far looking at the leaks of the size, cooling, and price of this thing. If it's like 90% faster than 2080 ti, many people won't be able to hold on to their wallets.
Posted on Reply
#44
ThrashZone
PowerPC: Good note and something that I would actually like to know. The fact that it's so big, draws so much power, is called 3090 (above the previous max of xx80), and has the new architecture must surely mean that its performance is a beast. Do we even know anything about its performance yet? I feel like most of the posts are just critiquing this card without any actually relevant information.

Because if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, which would mean it'll be like 100% faster per Watt than 2080 ti, using way more Watts... It could literally demolish the 2080 ti. Why is nobody talking about this possibility? Even if it's just a rumor, it also kinda makes sense to me so far looking at the leaks of the size, cooling, and price of this thing.
Hi,
It's not unusual for the newest to perform higher than previous releases.
Pretty much why it's best to wait and buy 1-2 years after release, or buy one generation back, so you can take advantage of depreciation from the high first-release prices.
I did not buy the 20 series, thankfully; I just held out, and yes, it was tough seeing the benchmark numbers of the 2080 Ti :eek:
Posted on Reply
#45
Ruru
S.T.A.R.S.
ThrashZone: Hi,
NVIDIA is not clairvoyant; 850W is only a recommended spec, not written in stone.
It's a lowball for my 9940X at 4.8 GHz & higher, and likely a lowball for my 10900K at 5.0 GHz & higher. I'm not triggered over that; it's pretty much why I overkill stuff and don't go bare minimum on anything but maybe the case :)
The 9940X is HEDT, and the 10900K is a toaster if you take the power limit away. Like I said, a typical gamer doesn't have a HEDT platform and doesn't run their Intel toasters with the power limit off.

Though a typical gamer these days has something like a 9700K, 10700K, or an AMD 2600-2700X or 3600-3700X. But I guess these are just opinions.
Posted on Reply
#46
PowerPC
ThrashZone: Hi,
It's not unusual for the newest to perform higher than previous releases.
Pretty much why it's best to wait and buy 1-2 years after release, or buy one generation back, so you can take advantage of depreciation from the high first-release prices.
I did not buy the 20 series, thankfully; I just held out, and yes, it was tough seeing the benchmark numbers of the 2080 Ti :eek:
Yea, but I mean this time could be a much bigger jump than usual. The 2080 Ti was "just" around 35% faster than the 1080 Ti. I would imagine people would throw their money at it if the 3090 actually turns out to be 90% faster than the 2080 Ti. That would be a totally different kind of proposition.
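Purely as an illustration of that math, combining the ~35% gen-on-gen figure with the rumored +90% (both numbers come from this thread and neither is confirmed):

# Illustrative only: stacking the ~35% (2080 Ti over 1080 Ti) figure with the
# rumored +90% (3090 over 2080 Ti) cited in this thread; neither is confirmed.
turing_over_pascal = 1.35
rumored_ampere_over_turing = 1.90

combined = turing_over_pascal * rumored_ampere_over_turing
print(f"That would put the 3090 at roughly {combined:.2f}x a 1080 Ti")  # ~2.57x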
Posted on Reply
#47
Ruru
S.T.A.R.S.
I wouldn't be surprised if they released a 3090 Ti later with a full chip, just like the 780 -> 780 Ti back in the day; I don't know whether the 3090 has anything disabled or not...
Posted on Reply
#48
ThrashZone
PowerPC: Yea, but I mean this time could be a much bigger jump than usual. The 2080 Ti was "just" around 35% faster than the 1080 Ti. I would imagine people would throw their money at it if the 3090 actually turns out to be 90% faster than the 2080 Ti. That would be a totally different kind of proposition.
Hi,
The 3090 is likely the Titan RTX replacement, seeing how much memory it might have, so I'm guessing the $2,500 US range, not $1,400 US.
$1,400 US is likely the 3080 Ti price.
All this is rumors, so there's no telling until actual release.
Posted on Reply
#49
R0H1T
Turmania: Nvidia, for the last decade, went the less-power, more-performance route and has been highly successful with this approach. So in general it does not make sense that they would sacrifice power consumption figures for more performance. But we will know for sure when reviews come up.
Part of the reason is that they didn't need to sacrifice efficiency for absolute performance (leadership), especially after the GTX 480/580 infernos. That was the brief time when AMD beat them in performance & efficiency across the board, with the launch of Radeon Evergreen IIRC. This leads me to believe there are a couple of possible hurdles they may have faced, forcing them to abandon their last decade's tried & tested strategy ~ (1) RDNA2 is a real beast & probably comes (dangerously) close to Ampere's efficiency &/or performance.

(2) It's possible AMD may launch RDNA3 about a year from now; with TSMC's 7nm & 5nm capacities nearly fully booked, there's little chance Nvidia can compete with AMD and Intel in the medium term with an inferior(?) node. If that's the case, then they're also in serious trouble, because there could be a major price war much like we've seen in the CPU space. They're trying to make as much $ as possible upfront!
Posted on Reply
#50
thesmokingman
They're gonna turn off a lot of buyers with this, so I don't see how someone as money hungry as Jensen would allow this to happen.
Posted on Reply