Sunday, August 23rd 2020
Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters
Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (and not a PCI-SIG or ATX standard), and that its power output limit could be as high as 600 W. Seasonic's adapter converts two 12 V 8-pin PSU-side connectors into one 12-pin connector, which lends weight to that power figure. On typical Seasonic modular PSUs, the included cables convert one PSU-side 8-pin 12 V connector into two 6+2 pin PCIe power connectors along a single cable. Hardwareluxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.
kopite7kimi, an NVIDIA leaker with an extremely high hit-rate, however predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB featuring three 8-pin PCIe power connectors, which seems to be additional proof that a single 12-pin connector is a really fat straw for 12 V juice. The label on the box for the Seasonic cable recommends using it with PSUs rated for at least 850 W (which could very well be a system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
Source:
VideoCardz
119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters
But yeah, the 12-pin connector for the RTX 30 FEs seems to be certain, and those will most probably come with a 2x 8-pin PCIe to 12-pin adapter.
Three 8-pin PCIe connectors at 150 watts each, plus up to 75 watts from the motherboard through the PCIe x16 slot, seems like a tremendous amount of power for a graphics card. Even if it's not using the maximum that configuration can provide, consider that two 8-pin connectors plus the PCIe slot would have been good for up to 375 watts, and NVIDIA's engineers determined that it needed another 8-pin power connector. :eek:
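The arithmetic in that comment can be sketched quickly; this is a minimal illustration assuming the per-connector maxima from the PCIe CEM specification (75 W slot, 150 W per 8-pin), not a statement about what any actual card draws.

```python
# Hedged sketch of the PCIe power-budget arithmetic above.
PCIE_SLOT_W = 75    # PCIe x16 slot can supply up to 75 W
EIGHT_PIN_W = 150   # each 8-pin PCIe connector is rated for 150 W

def board_power_limit(num_8pin: int) -> int:
    """Spec-maximum board power for a card drawing from the slot
    plus the given number of 8-pin PCIe connectors."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

print(board_power_limit(2))  # 375 -- two 8-pin plus slot
print(board_power_limit(3))  # 525 -- the rumored custom RTX 3090 layout
```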
Funny outrage over 850w psu lol :roll:
Chips are getting more and more cores, even non-HEDT chips like the 10900K, and a lot of mainstream AMD chips are 12 cores and upwards, so it's just a number accounting for people running more than 4/6/8 cores. And knowing their audience is overclocking everything nowadays, even turbo clocks are getting up there :-)
Bingo
So a 750 W, back when most GPUs to date carried a 600-650 W recommendation, worked out just fine for most people.
But being triggered over this is funny lol :)
I mean, I have a 1200 P2 on X299 and a 1000 P2 on Z490; I also have an 850 P2 on X99.
Yeah, one could figure that once NVIDIA said 2x 8-pin was bulky and a pain to route for builders, this single 12-pin nonsense was just silly.
Hopefully real GPU makers will go on as they always have, with 2x 8-pin or, on monsters like the KINGPIN, 3x 8-pin, and forget this NVIDIA 12-pin nonsense plug entirely :rockout:
RTX 3090xxx... THE world's first & only neutrino-based, graviton-accelerator-powered GPU, featuring our patented, atmosphere-depleting quadruple super-duper LH2^⁹⁹ cooling system. And with an MSRP of a mere $5.48752MM USD, everyone who uses any computer anywhere for any purpose will be telepathically compelled to buy one, or 2, or maybe even 12!
Yep, you heard it here 1st folks,....
And yes, you're welcome ..:laugh:..:D..o_O
My argument still stays, 850W is overkill for a single-GPU and non-HEDT system.
NVIDIA is not clairvoyant; 850 W is only a recommended spec, not written in stone.
Lowball for my 9940X at 4.8 GHz and higher, and likely lowball for my 10900K at 5.0 GHz and higher. I'm not triggered over that; it's pretty much why I overkill stuff and don't go bare minimum on anything, except maybe the case :-)
Because if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, that would mean roughly triple the performance per Watt of the 2080 Ti; scale that up to way more Watts, and it could literally demolish the 2080 Ti. Why is nobody talking about this possibility? Even if it's just a rumor, it also kinda makes sense to me so far looking at the leaks of the size, cooling, and price of this thing. If it's like 90% faster than the 2080 Ti, many people won't be able to hold on to their wallets.
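The efficiency math in that rumor is easy to sanity-check; this quick sketch uses only the hypothetical rumor ratios (not measured figures), with the 2080 Ti as a baseline of 1.0.

```python
# Hedged sketch of the rumored performance-per-Watt arithmetic.
# All ratios are hypothetical rumor figures, not measurements.
def perf_per_watt_ratio(perf_ratio: float, power_ratio: float) -> float:
    """Relative efficiency versus the baseline card (2080 Ti = 1.0)."""
    return perf_ratio / power_ratio

# "50% more performance at half the power" works out to triple the perf/W:
print(perf_per_watt_ratio(1.5, 0.5))  # 3.0
# "100% faster per Watt" would instead mean, e.g., double the performance
# at the same power draw:
print(perf_per_watt_ratio(2.0, 1.0))  # 2.0
```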
Not unusual for newest to perform higher than previous releases
Pretty much why it's best to wait and buy 1-2 years after release, or buy one generation back, so you can take advantage of depreciation from high first-release prices.
Thankfully I did not buy the 20 series, just held out, and yes, it was tough seeing the benchmark numbers of the 2080 Ti :eek:
Though the typical gamer these days has something like a 9700K, 10700K, or an AMD 2600-2700X or 3600-3700X. But I guess these are just opinions.
The 3090 is likely the Titan RTX replacement, seeing how much memory it might have, so I'm guessing the $2,500 range, not $1,400.
$1,400 is likely the 3080 Ti price.
All this is rumors so no telling until actual release.
(2) It's possible AMD may launch RDNA3 about a year from now. With TSMC's 7 nm and 5 nm capacity nearly fully booked, there's little chance NVIDIA can compete with AMD and Intel in the medium term on an inferior(?) node. If that's the case, then they're also in serious trouble, because there could be a major price war much like we've seen in the CPU space. They're trying to make as much $ as possible upfront!