Sunday, August 23rd 2020

Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (not a PCI-SIG or ATX standard), and that its power output limit could be 600 W. Seasonic's adapter converts two PSU-side 8-pin 12 V connectors into one 12-pin connector, which lends weight to that power figure. On typical Seasonic modular PSUs, the included cables convert one PSU-side 8-pin 12 V connector into two 6+2 pin PCIe power connectors along a single cable. HardwareLuxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.
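The rumored 600 W ceiling is consistent with a back-of-the-envelope tally of the connector's six 12 V circuits. As a rough sketch (the 8.5 A per-circuit rating is our assumption, based on typical Molex Micro-Fit 3.0 class terminals, not anything NVIDIA has published):

```python
# Back-of-the-envelope power ceiling for a 12-pin connector with
# six 12 V supply pins and six ground returns.
RAIL_VOLTAGE = 12.0       # volts on each supply pin
AMPS_PER_CIRCUIT = 8.5    # assumed terminal rating (Micro-Fit 3.0 class)
SUPPLY_CIRCUITS = 6       # six 12 V pins paired with six grounds

ceiling_watts = RAIL_VOLTAGE * AMPS_PER_CIRCUIT * SUPPLY_CIRCUITS
print(f"Theoretical ceiling: {ceiling_watts:.0f} W")  # prints 612 W
```

That lands just above the rumored 600 W figure, which would presumably be derated slightly for safety margin.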

kopite7kimi, an NVIDIA leaker with an extremely high hit rate, however, predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB, shown featuring three 8-pin PCIe power connectors. This reinforces the notion that a single 12-pin connector is a really fat straw for 12 V juice. The label on the Seasonic cable's box recommends pairing it with a PSU rated for at least 850 W (which could very well be a system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
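For context, the in-spec power budget of that leaked triple 8-pin custom board can be tallied from the standard PCIe allowances (150 W per 8-pin connector, 75 W through the x16 slot); a minimal sketch:

```python
# In-spec power budget for a card with three 8-pin PCIe connectors.
EIGHT_PIN_WATTS = 150   # PCIe spec allowance per 8-pin aux connector
SLOT_WATTS = 75         # PCIe spec allowance through the x16 slot
NUM_CONNECTORS = 3      # as seen on the leaked custom RTX 3090 PCB

board_budget = NUM_CONNECTORS * EIGHT_PIN_WATTS + SLOT_WATTS
print(f"In-spec board budget: {board_budget} W")  # prints 525 W
```

A 525 W in-spec ceiling on the custom board sits in the same ballpark as the single 12-pin's rumored capacity, which supports the "fat straw" reading.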
Source: VideoCardz

119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

#1
RedelZaVedno
Recommended to use an 850W power supply... Is the 3090 an induction heater? F this... no way am I buying a GPU with such power demands.
#2
TheLostSwede
News Editor
RedelZaVedno said:
Recommended to use 850W power supply... Is 3090 an induction heater? F this... no way am I buying GPU with such power demands.
No, it's a fan heater...

#3
dinmaster
I think the GPU will come with the adapter in the box, like in the past with Molex to PCIe 6-pin adapters. Stupid if not..
#4
dj-electric
If all we have is 6 12V+ and 6 12V-, I'll be wiring my own; I don't need premium cables to do it for me.
A $1 connector and the ability to move the existing wires off my power supply into it is all I would need.
#6
techboj
dj-electric said:
If all we have is 6 12V+ and 6 12V-, ill be wiring my own, dont need no premium cables to do it for me.
1$ connector and an ability to extract existing ones off my power supply into it is all i would need.
You could technically take the correct power wires from your existing cables, but you would need the NVIDIA "end", which has the proprietary connector, pin arrangement, and keyed socket.
#7
dj-electric
techboj said:
You could technically take the correct power wires from the existing but you would need the Nvidia "end" which has the proprietary connectors, pin arrangement and keyed socket
Isn't this just a standard 0430251200 Molex connector? That would suck.
It does look like a standard 0430251200. Nothing proprietary here. Wiring this would take a quick minute.
#8
Dristun
Didn't we see leaked boards with triple 8-pin? Maybe the FE ones will ship with this connector, but otherwise, like, who cares. Just hope the MSRP for partner boards is still $700 for the xx80 level.
#9
TheLostSwede
News Editor
dj-electric said:
Isnt this just a standard 0430251200 Molex connector? that would suck.
It does look like a standard 0430251200. Nothing proprietary here. Wiring this would take a quick minute.
I have to ask, how did you know that connector even exists?
#11
ZoneDymo
So motherboards are going 12 V and now video cards as well, interesting.
#12
Turmania
For the last decade, NVIDIA went the less-power, more-performance route and has been highly successful with this approach. So in general it doesn't make sense that they would sacrifice power consumption figures for more performance. But we will know for sure when the reviews come out.
#13
jayseearr
RedelZaVedno said:
Recommended to use 850W power supply... Is 3090 an induction heater? F this... no way am I buying GPU with such power demands.
I feel you there^
The thing is, (realistically) anybody who is willing to pay the price of these cards will not be breaking the bank by spending a couple hundred more on a psu
#14
Aquinus
Resident Wat-man
Leave it to nVidia to re-invent their own special wheel and sell it as progress.
#15
PowerPC
This is the price you have to pay for that smooth 4K gaming experience. People who complain about how big this card is or how much power it will suck at max or even the price, don't seem to understand what this card is geared to. This might be the first real 4K gaming card that actually deserves the name. But I'll reserve judgment until launch, which should be very soon.
#16
RedelZaVedno
It's not just the price, it's the heat such a GPU emits too. Imagine playing something like FS 2020, which stresses the GPU to the max all the time, plus add 140 W for a typical Intel 8-core K CPU's gaming power consumption, and you're at 500+ W real power consumption. This makes gaming impossible in warmer climates during the summer months, and I'm definitely not installing AC just to use a PC.
#17
jayseearr
RedelZaVedno said:
I'm definitely not installing AC just to use PC.
Well then you clearly aren't in the "I don't give a sh*t" bracket that this card is targeted for :D
#18
Lionheart
RIP! Guess I'm aiming for the 3080s / 3070s instead.
#19
Bones
RedelZaVedno said:
Fermi Deja Vu all over again :(
Who needs gas when you can cook with Nvidia?

Now available with the standard or the heart-healthy "George Foreman" grilling machine cooler.......
Great for big LAN party luncheons!

Remember - The more you buy, the more steak you can grill!
#20
Apocalypsee
RedelZaVedno said:
Fermi Deja Vu all over again :(
I've seen this before, but never would have thought there was an animated GIF version :laugh:
#21
Nkd
PowerPC said:
This is the price you have to pay for that smooth 4K gaming experience. People who complain about how big this card is or how much power it will suck at max or even the price, don't seem to understand what this card is geared to. This might be the first real 4K gaming card that actually deserves the name. But I'll reserve judgment until launch, which should be very soon.
Leave it up to someone to justify it. Hey, just say it like it is. NVIDIA has been all about efficiency for a while now. They had to push this card, and the only reason I can see is that Big Navi is probably competitive, so NVIDIA had to stretch the power envelope to keep the performance crown, according to whatever they know internally.
#22
Aquinus
Resident Wat-man
PowerPC said:
This is the price you have to pay for that smooth 4K gaming experience. People who complain about how big this card is or how much power it will suck at max or even the price, don't seem to understand what this card is geared to. This might be the first real 4K gaming card that actually deserves the name. But I'll reserve judgment until launch, which should be very soon.
...except it's really no different than two 6-pin connectors. :laugh:
#23
Unregistered
Dristun said:
Didn't we see leaked boards with triple 8-pin? Maybe the FE ones will ship with this connector but otherwise, like, who cares. Just hope the MSRP for partner boards is still 700$ for xx80 level.
With a performance bump, not like the 2080 that struggled to beat the 1080 Ti while costing more.
At this price it should comprehensively beat the 2080 Ti.
#24
PowerPC
RedelZaVedno said:
It's not just the price, it's the heat such GPU emits too. Imagine playing something like FS 2020 which stresses GPU to the max all the time plus add 140W for a typical intel 8 core K CPU gaming power consumption and you're at +500W real power consumption. This makes gaming impossible in warmer climate during summer months and I'm definitely not installing AC just to use PC.
We don't even know yet how hot it will get. That bigger cooler might be there for a good reason.
#25
rtwjunkie
PC Gaming Enthusiast
Lionheart said:
RIP! guess I'm aiming for the 3080's / 3070's instead.
LOL, I may lower my standards and just buy a used 2080Ti. :shadedshu: