Sunday, August 23rd 2020

Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (not a PCI-SIG or ATX standard), and that its power output limit could be as high as 600 W. Seasonic's adapter converts two PSU-side 8-pin 12 V connectors into one 12-pin connector, which lends support to that power figure: on typical Seasonic modular PSUs, the included cables convert a single PSU-side 8-pin 12 V connector into two 6+2-pin PCIe power connectors along one cable. HardwareLuxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.

kopite7kimi, an NVIDIA leaker with an extremely high hit-rate, however predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB, and it features three 8-pin PCIe power connectors. This is additional evidence that a single 12-pin connector is a really fat straw for 12 V juice. The label on the Seasonic cable's box recommends pairing it with a PSU rated for at least 850 W (which could very well be the system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
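For reference, the power arithmetic behind these reports can be sketched as follows. The 150 W per-8-pin and 75 W slot figures are the standard PCIe limits; the 600 W ceiling is the rumored figure from the sources above, not a confirmed spec:

```python
# Rough power-budget arithmetic behind the reports above.
# 150 W / 75 W are standard PCIe CEM limits; 600 W is only rumored.
PCIE_8PIN_W = 150      # spec limit per standard 8-pin PCIe connector
PCIE_SLOT_W = 75       # power deliverable through the PCIe slot itself
RUMORED_12PIN_W = 600  # rumored ceiling of NVIDIA's 12-pin connector

def board_budget(n_8pin: int) -> int:
    """Power a card can draw from n standard 8-pin plugs plus the slot."""
    return n_8pin * PCIE_8PIN_W + PCIE_SLOT_W

# The custom RTX 3090 board spotted with three 8-pin connectors:
print(board_budget(3))                    # 525
# A single 12-pin at the rumored limit would exceed even that:
print(RUMORED_12PIN_W > board_budget(3))  # True
```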
Source: VideoCardz
Add your own comment

119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

#51
ThrashZone
thesmokingmanThey're gonna turn off a lot of buyers with this, so I don't see how someone as money hungry as Jensen would allow this to happen.
Hi,
Only thing I've noticed that really "turns off" buyers is GPU price, not a power adapter or a new cable, which is chicklets by comparison.
Posted on Reply
#52
thesmokingman
ThrashZoneHi,
Only thing I've noticed that really "turns off" buyers is GPU price, not a power adapter or a new cable, which is chicklets by comparison.
Apparently people like to complain about price, yet they keep buying these GPUs at tripled prices. Nvidia didn't get here alone.
Posted on Reply
#53
Space Lynx
Astronaut
I have a feeling the 3090 exists only because RDNA2 is going to tie/beat a 2080 Ti, so Nvidia felt they had to go whole hog on the 3090 just so benches will show Nvidia is still king. But in reality most people will only buy the 3080 due to price. Meh, we will see.

I'm still going with the 4800X and Big Navi on their respective release dates. I love my current all-AMD system. As long as I don't OC the GPU I have no issues. And fair enough, I'm too old and tired to care about that crap anymore.
Posted on Reply
#54
Searing
Relax people, the recommended PSU number means nothing; it's there to cover people with bad or dual-rail PSUs. I'll be perfectly able to run an RTX 3090 with a gold 550 W supply. The only thing that really trips power supplies is Intel turbo (65 W suddenly -> 250 W -> 65 W within a few seconds). If you limit your max Intel turbo power draw to 100 W (which makes no difference in gaming; I've never come close to 100 W in gaming), then it is easy to use a 550 W supply (400 W available). You just need knowledge and quality.
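The headroom argument above can be sketched like this; the 50 W figure for the rest of the system is an assumption for illustration:

```python
# Sketch of the 550 W headroom argument; the 50 W "rest of system"
# figure is an assumption, not a measured value.
psu_w = 550      # gold-rated 550 W supply
cpu_cap_w = 100  # Intel turbo power draw capped at 100 W
rest_w = 50      # motherboard, RAM, drives, fans (assumed)

gpu_headroom_w = psu_w - cpu_cap_w - rest_w
print(gpu_headroom_w)  # 400 W left for the graphics card
```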
Posted on Reply
#55
blued
R0H1TPart of the reason is they didn't need to sacrifice efficiency for absolute performance (leadership) especially after the GTX 480/580 infernos...
The GTX 580 (v2 Fermi) was a 'corrected version' of the 480 and solved the heat issue. Later to come on the other side was Hawaii (290/290X), basically AMD's own 'inferno', which was just as bad on the heat/noise front. Odd how people seem to forget that.

The 3090 IMO is basically Nvidia's 'doomsday weapon', an all-out effort to ensure it maintains the top spot in every review site's benchmark charts and to solidify its 'mind share' in case Big Navi is better than expected. And I believe Nvidia is taking a no-holds-barred approach with it: price, power, heat be damned.

I believe there was supposed to be a 3080 Ti, but they may have temporarily set that aside to focus on something that virtually guarantees a win over Big Navi. But that leaves too big a gap between the 3090 and 3080, so I'm pretty sure they will come out with a 3080 Ti at some point after Big Navi is released.


Posted on Reply
#56
ODOGG26
bluedThe GTX 580 (v2 Fermi) was a 'corrected version' of the 480 and solved the heat issue. Later to come on the other side was Hawaii (290/290X), basically AMD's own 'inferno', which was just as bad on the heat/noise front. Odd how people seem to forget that.

The 3090 IMO is basically Nvidia's 'doomsday weapon', an all-out effort to ensure it maintains the top spot in every review site's benchmark charts and to solidify its 'mind share' in case Big Navi is better than expected. And I believe Nvidia is taking a no-holds-barred approach with it: price, power, heat be damned.

I believe there was supposed to be a 3080 Ti, but they may have temporarily set that aside to focus on something that virtually guarantees a win over Big Navi. But that leaves too big a gap between the 3090 and 3080, so I'm pretty sure they will come out with a 3080 Ti at some point after Big Navi is released.
I thought in the 290/290x case it was because of the crap cheapo reference cooler and not because it was an actual inferno like the 480/580.
Posted on Reply
#57
zlobby
Another day, another proprietary connector. Well done, nvidia...
Posted on Reply
#58
JustAnEngineer
If your Intel Comet Lake furnace and your NVidia Ampere furnace are both pulling maximum amperes at the same time, you may very well need an 850-watt power supply. The 660-watt PSUs that I've used for most of my family's PCs in the past decade might not be adequate for that load.

What this means for me is that I will swap my Focus PX-850 power supply into the new PC that I'm building to house a yet-to-be-released graphics card and I'll put the spare SS-660XP2 Platinum in the old PC since it "only" has to power a Vega64. :D I'm willing to pay for the extra electricity for the graphics card and the air conditioning while gaming. I am much more concerned about getting adequate case ventilation to handle that much heat from a graphics card without resorting to dustbuster noise levels.

Going forward, I will also revise my current usual PSU recommendation of a Prime PX-750, which has the same compact 140 mm depth as the entire Focus series. The Prime PX-850 and 1000 take up an additional 30 mm.
Posted on Reply
#59
Searing
JustAnEngineerIf your Intel Comet Lake furnace and your NVidia Ampere furnace are both pulling maximum amperes at the same time, you may very well need an 850-watt power supply. The 660-watt PSUs that I've used for most of my family's PCs in the past decade might not be adequate for that load.

What this means for me is that I will swap my Focus PX-850 power supply into the new PC that I'm building to house a yet-to-be-released graphics card and I'll put the spare SS-660XP2 Platinum in the old PC since it "only" has to power a Vega64. :D I'm willing to pay for the extra electricity for the graphics card and the air conditioning while gaming. I am much more concerned about getting adequate case ventilation to handle that much heat from a graphics card without resorting to dustbuster noise levels.

Going forward, I will also revise my current usual PSU recommendation of a Prime PX-750, which has the same compact 140 mm depth as the entire Focus series. The Prime PX-850 and 1000 take up an additional 30 mm.
No. Single-card systems don't need nearly that much power for gaming. Vega 64 is the worst card I've ever seen for power draw; I'm not sure yet whether NVIDIA will exceed it. You'd have to do very heavy CPU OCing to run out of power with a 660 W PSU.
Posted on Reply
#60
Fluffmeister
ODOGG26I thought in the 290/290x case it was because of the crap cheapo reference cooler and not because it was an actual inferno like the 480/580.
The power consumption was right up there, so the heat output would be just the same too:

www.techpowerup.com/review/amd-r9-290x/25.html

Things went truly mental on the rebranded 390X cards:

www.techpowerup.com/review/msi-r9-390x-gaming/28.html

But hey, Hawaii was a good name because it's hot there too.
Posted on Reply
#61
Krzych
Cannot see any problem here. This connector is almost as small as single 8-pin. If you can get a full new cable for a modular PSU and not an adapter then this is an upgrade all around.
lynx29I have a feeling the 3090 exists only because RDNA2 is going to tie/beat a 2080 ti, so Nvidia felt they had to go full hog on the 3090 just so benches will show nvidia is still king. but in reality most people will only buy the 3080 due to price. meh we will see.

I'm still going with 4800x and Big Navi on their respective release dates. I love my current all AMD system. as long as I don't oc the gpu I have no issues. and fair enough, I'm to old and tired to care about that crap anymore.
Well, at least you admit that you are complaining because you are old and tired; that's respectable :P It would be great if all the malcontents here admitted that they are complaining because they want the new cards to be bad, since it fits their story and state of mind.
Posted on Reply
#62
R0H1T
PowerPCBecause if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, which would mean it'll be like 100% faster per Watt than 2080 ti, using way more Watts...
No it's 3x more efficient then, which as I posted in the other rumor threads previously ~ is not going to happen!
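A quick check of the quoted rumor's math, using the two ratios from the quote, shows why "3x more efficient" is the right reading:

```python
# Sanity check of the rumor math quoted above.
perf = 1.5   # "50% more performance" than Turing
power = 0.5  # "half the power" of Turing

perf_per_watt = perf / power
print(perf_per_watt)  # 3.0 -> 3x efficiency, not "100% faster per Watt" (2x)
```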
Posted on Reply
#63
blued
ODOGG26I thought in the 290/290x case it was because of the crap cheapo reference cooler and not because it was an actual inferno like the 480/580.
As mentioned the 580 was a corrected version of Fermi.


You mean the "crap cheapo reference cooler" of Hawaii vs the "crap cheapo reference cooler" of Fermi? Got it.



Bottom line, AMD has been more associated with hot, loud and power hungry over last 10 years than Nvidia has in terms of ref coolers.
Posted on Reply
#64
PowerPC
R0H1TNo it's 3x more efficient then, which as I posted in the other rumor threads previously ~ is not going to happen!
You're right. That's one of the rumors I read. There were also more conservative rumors for this card saying it's 60%-90% faster than 2080 ti, so still an extremely large difference, even if it's on the lowest end of this rumor. Just 60% faster would be insane.
Posted on Reply
#66
semantics
I wonder how much of that 850 W recommendation is down to the 12-pin supposedly being able to carry up to 600 W. It's probably not smart to run such a connector on a 600 W power supply.
Posted on Reply
#67
my_name_is_earl
The hottest card I ever had was the ATI dual-GPU single card... I think it was the 29xxx series. You really don't need a space heater with that card.
Posted on Reply
#68
Aquinus
Resident Wat-man
bluedLater to come on the other side was Hawaii (290/290x), basically AMDs own 'infernos' which were just as bad on the heat/noise front. Odd how ppl seem to forget that.
The R9 390 I had ran hot, but I don't remember it catching on fire. Just saying.
Posted on Reply
#69
AsRock
TPU addict
ZoneDymoso motherboards are going 12v and now videocards as well, interesting
Mobos going 12 V? I was sure it was dropping the 5 V and 3.3 V rails that Intel is trying to push.
Posted on Reply
#70
Krzych
PowerPCYou're right. That's one of the rumors I read. There were also more conservative rumors for this card saying it's 60%-90% faster than 2080 ti, so still an extremely large difference, even if it's on the lowest end of this rumor. Just 60% faster would be insane.
60% is not that unreasonable considering that Ampere succeeds the generation that didn't make big leaps in raw performance and didn't have node shrink or much better memory. Ampere gets massive upgrades all around, if it doesn't bring massive gains then nothing will anymore.
Posted on Reply
#71
Searing
Krzych60% is not that unreasonable considering that Ampere succeeds the generation that didn't make big leaps in raw performance and didn't have node shrink or much better memory. Ampere gets massive upgrades all around, if it doesn't bring massive gains then nothing will anymore.
And it has also been two years, the longest time between upgrades ever.
Posted on Reply
#72
Unregistered
Yeah interesting - but it won't make much difference to those that always require the highest performing gaming PCs.

...unless those that required the fastest performing gaming PCs, suddenly don't.
Posted on Edit | Reply
#73
Rob94hawk
Should I just convert my outlet to 220V now?
Posted on Reply
#74
AnarchoPrimitiv
I've heard rumors that the 3090 will use 390+ watts; guess that's turning out to be true.
SearingNo. Single-card systems don't need nearly that much power for gaming. Vega 64 is the worst card I've ever seen for power draw; I'm not sure yet whether NVIDIA will exceed it. You'd have to do very heavy CPU OCing to run out of power with a 660 W PSU.
Like I said, I've read rumors that the 3090 will use 390+ watts.
TurmaniaNvidia, for the last decade, went the less-power, more-performance route and has been highly successful with this approach. So in general it does not make sense that they would sacrifice power consumption figures for more performance. But we will know for sure when the reviews come out.
It does make sense if RDNA2 is going to be good... I've heard rumors that while RDNA2 might not get the performance crown, it will win the lower high-end/upper mid-tier segment based on its efficiency and performance, and that the second-biggest Navi could be the real star of the next-gen cards.
Posted on Reply
#75
Agentbb007
Wow I’m glad I bought a 1000 watt power supply. The EVGA 1000 P2 is going to be in for a workout.
Posted on Reply