Monday, February 19th 2024

NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

NVIDIA is reportedly looking to change its power connector standard for the fourth successive time in a span of three years with the upcoming GeForce RTX 50-series "Blackwell" GPUs, Moore's Law is Dead reports. NVIDIA began its post 8-pin PCIe journey with the 12-pin Molex MicroFit connector on the GeForce RTX 3080 and RTX 3090 Founders Edition cards. The RTX 3090 Ti went on to standardize the 12VHPWR connector, which the company debuted across a wider section of its GeForce RTX 40-series "Ada" product stack (all SKUs with a TGP of over 200 W). In the face of rising complaints about the reliability of 12VHPWR, some partner RTX 40-series cards are beginning to implement the pin-compatible but sturdier 12V-2x6. The introduction of the 16-pin PCIe Gen 6 connector would be the fourth power connector change, if the rumors are true. A different source says that rival AMD has no plans to move away from the classic 8-pin PCIe power connectors.

Update 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with the same access to the PCIe CEM specification as NVIDIA, and say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector were in the offing, the power supply industry would have some clue by now. It doesn't. Read more about this in the Hardware Busters article in the source link below.

Update Feb 20th: In an earlier version of the article, it was incorrectly reported that the "16-pin connector" is fundamentally different from the current 12V-2x6, with 16 pins dedicated to power delivery. We have since been corrected by Moore's Law is Dead: it is in fact the same 12V-2x6, but under an updated PCIe 6.0 CEM specification.
Sources: Moore's Law is Dead, Hardware Busters

106 Comments on NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

#26
Onasi
Jesus, can we maybe stop hopping standards all the time? This kinda makes the whole “standard” thing pointless. I don’t care what NV chooses, but at least stick with it.
#27
Kn0xxPT
I think that above 500 W GPU power consumption, or even 400 W... we seriously need to re-think "tech evolution", because powering framerates with watts shouldn't be the answer for "next-gen".
At some point we'll have PCs that are only GPUs with an SSD and a power supply. Components like the MB, RAM etc. will be pointless....
#28
gurusmi
dgianstefaniHaving the ability to supply x amount of power doesn't automatically mean components using the connector will max it out. The 4090 uses a power connector rated for 600 W, but the typical max power draw is ~450 W, and the 4080 which uses the same connector stays below 350 W.
To be honest, 480 W at the GPU means 40 A at 12 V. That is a really huge current. Even 30 A is a lot. Just imagine: 50 mA across the heart is enough to stop it beating.
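The arithmetic above is easy to check. A minimal sketch, assuming a 12 V rail and the six power-pin pairs of a 12VHPWR/12V-2x6 connector with an idealized even current split (illustrative figures, not spec values):

```python
# Ohm's-law sanity check for GPU power connectors.
# Assumptions (not spec values): 12 V rail, 6 power pins sharing the load evenly.

def total_amps(power_w: float, volts: float = 12.0) -> float:
    """Total current drawn from the rail."""
    return power_w / volts

def per_pin_amps(power_w: float, power_pins: int = 6, volts: float = 12.0) -> float:
    """Current through each 12 V pin, assuming a perfectly even split."""
    return total_amps(power_w, volts) / power_pins

print(total_amps(480))    # 40.0 A total at 480 W
print(per_pin_amps(600))  # ~8.33 A per pin at the full 600 W rating
```

In practice the split is not even, which is exactly why a single poorly seated pin can end up carrying far more than its share.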
ErikGR.I.P. ATX 3.0 PSU buyers.
I got one for my current rig. But it comes with a 2x8-pin PCIe adapter for using the new nVidia power outlet at the PSU. ;)
#29
rv8000
pat-ronerProbably have a 24-pin by then, and require a 25 A breaker to run a top-of-the-line gaming PC
No one will be able to afford more than a $799 xx60 by then so who cares what we’re using for a power connector, we’ll all be priced out of the PC gaming market anyways.
#30
Crackong
So this amounts to Nvidia admitting the failure of the 12VHPWR.
What a drama, Nvidia trying to push its proprietary connector as a public standard.
And now Nvidia wants to do it again?
#31
Vayra86
Panther_SeraphinSo anyone who bought a new PSU with the "new" 12V connector basically got Jebaited
Naaaah it's fine man! 12VHPWR is fine too man! User error! Nvidia knows best! PCI-SIG is responsible! There is no problem! Connector ain't seated well? Push it in proper, snowflake! They'll make a nice little adapter for it again come Blackwell, 12VHPWR to 16-pin. I hear Cablemods is already working on perfecting their design, seeing as they need their sweet time given recent experiences.

This is fine.
Early adopting is fine.

(ICYMI, /s)
OnasiJesus, can we maybe stop hopping standards all the time? This kinda makes the whole “standard” thing pointless. I don’t care what NV chooses, but at least stick with it.
Mandatory xkcd

#32
N/A
We are all competing standards as humans, like demonstration models pac-manning about. 10 billion of them. The clunky connector has no business on the PSU side. But no, it needs to know it has a GPU connected and how much power to reserve. Why does it need to know? Hence the presence of the signaling pins. The PSUs that have it are at a very experimental stage, so it could be discarded as a transitional period. Too bad, but they all have a ton of connectors, like SATA cables, that very few are using, so it's just a reminder of old tech. The PSU is still good, not obsolete.
#33
RahkSha
DavenWith 16 pins, does that mean the connector supplies up to 800 W? Since the 4090 goes up to almost 500 W, does that mean the 5090 will need over 600 W? I would not be surprised by this as the 3 nm node is nowhere near ready for this kind of chip so that means Blackwell will be on the same 4 nm process.
Hopefully this means they have built in extra redundancy. 8-pin connectors have 30%-60% additional wattage capacity over their rated 150 W load; the 12VHPWR only has 10% or so.

The extra capacity can be helpful if, say, one pin doesn’t make full contact and the other pins need to increase load to compensate.

Also, while there are 600 W BIOSes out there for 4090s, short of shunt-modding the power circuits on the card the most they pull is ~550 W, and even that is sort of a waste, as the performance difference between that and a 450 W BIOS, which many entry-level 4090s have, is only a couple of percent.

The 4090 is really a 450 W card. It can take an extra 100 W if you want to squeeze out the last couple of percent, but that's not really indicative of the card. Heck, many 4090 users undervolt their cards to 300-350 W and keep 90%-95% of the performance of the 450 W config.
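The headroom argument above can be made concrete. A small sketch where the per-pin current limits (6 A and 9.2 A) are assumptions chosen for illustration, not spec values:

```python
# Safety margin = connector's physical capacity over its rated load, minus 1.
# Per-pin current limits below are illustrative assumptions, not spec values.

def headroom(rated_w: float, power_pins: int, amps_per_pin: float,
             volts: float = 12.0) -> float:
    """Fractional wattage margin a connector has above its rated load."""
    capacity_w = power_pins * amps_per_pin * volts
    return capacity_w / rated_w - 1.0

# 8-pin PCIe: 150 W rating, 3 power pins at an assumed 6 A each -> 216 W capacity
print(f"{headroom(150, 3, 6.0):.0%}")   # 44% margin
# 12VHPWR: 600 W rating, 6 power pins at an assumed 9.2 A each -> ~662 W capacity
print(f"{headroom(600, 6, 9.2):.0%}")   # 10% margin
```

With numbers in these ranges, the 8-pin lands inside the 30%-60% band mentioned above while the 12VHPWR sits at roughly 10%, which is why a single bad pin eats most of its margin.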
#34
TheDeeGee
A change to make it safer... everyone complains... lol

The world is so fucked.
#35
GodisanAtheist
spcyslsYou didn't seriously just use Moore's law is dead as a source
-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.

The market has a need (theory crafting tech stuff) and MLID (as well as other equally prolific and unreliable "leakers") fill that need by partly making up stuff and throwing it out to the masses ravenous for info.

Even aggregator sites like TPU and Techspot (and more) have started laundering this content to give it an air of validity because their users eat this stuff up and it gets page hits.

Whole industry getting built around BS leaks, it's where the money is.
#36
wNotyarD
TheDeeGeeA change to make it safer... everyone complains... lol

The world is so fucked.
#37
3x0
btarunrUpdate 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with the same access to the PCIe CEM specification as NVIDIA, and say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector were in the offing, the power supply industry would have some clue by now. It doesn't. Read more about this in the Hardware Busters article in the source link below.
Welp, not surprised
#38
CyberPomPom
GodisanAtheist-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.

The market has a need (theory crafting tech stuff) and MLID (as well as other equally prolific and unreliable "leakers") fill that need by partly making up stuff and throwing it out to the masses ravenous for info.

Even aggregator sites like TPU and Techspot (and more) have started laundering this content to give it an air of validity because their users eat this stuff up and it gets page hits.

Whole industry getting built around BS leaks, it's where the money is.
Yep. Even more worrying is that TPU made up this "16 power pins" connector. MLID doesn't say that Blackwell will use a new connector; he just mentions the 12V-2x6 as the 16-pin Nvidia connector. You know, 16 pins as in 2x6+4=16. I couldn't find anywhere in the transcript of the MLID video a "leak" of a 2x8 power-pin connector.
#39
P4-630
and if you ever need more power, you can always use a pair of them!
No thanks, using one is risky enough....
#40
Vayra86
TheDeeGeeA change to make it safer... everyone complains... lol

The world is so fucked.
The world is more fucked when you think about how this ever passed common-sense QA. Also, I'm still in wait-and-see mode wrt this happening. The source is MLID.

None of this should have ever happened. The 4090 is the only card that would need an extra 8-pin. And it's a fucking brick-sized GPU. Space enough.

It's always good to keep going back to the why. Who benefits here? The 4090 with the 12VHPWR adapter is too wide for most cases. It's fucking useless. It's placed in the wrong location. Need we go on?
#41
dgianstefani
TPU Proofreader
Maybe the 16-pin was on the table before NVIDIA knew AMD wasn't making a high-end RDNA4, but it seems NVIDIA has dropped their multi-chip flagship card, as there will be no competition for them in the high end. A dual-die halo SKU would probably have needed more than 600 W.
#42
R0H1T
GodisanAtheist-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.
Yeah, except the part where he gaslights these fights. Didn't he also delete a few videos after some of the "leaks" ended up worse than horse sh!t :rolleyes:

Sounds like a cheap version of The Daily Bugle Mail :shadedshu:
#43
N/A
3x0Welp, not surprised
I don't think he ever stated that it was an entirely new 16-pin; the emphasis was that Nvidia is doubling down on the existing one. The confusion comes from the multiple names: is it a 12+4, or 16, or 2x6 elongated plus 4 recessed signaling pins that we don't count? It would have been better to just make it a 3x6 and dedicate the upper 1x6 row to communicating with the GPU; now it's just butt ugly.
#44
Sabotaged_Enigma
I want to say something but I can't find the words... because I'm speechless...
#45
Dirt Chip
A new day, a new PSU connector standard.
#46
DY69SX
PumperWouldn't really make sense, unless it's a 4x8pin to 16pin adapter.
Who cares, if you get an adapter with it ( ͡° ͜ʖ ͡° )つ──☆*:・゚ And the crying people do about it makes me rethink the intelligence of people across the world (ʘ‿ʘ)
ErikGR.I.P. ATX 3.0 PSU buyers.
Why is that?? Does that mean all cable adapters stop existing, or that Ngreedia stops giving adapters with new cards (ʘ‿ʘ)
gurusmiI got one for my current rig. But it comes with a 2x8-pin PCIe adapter for using the new nVidia power outlet at the PSU. ;)
And you see how snowflakes cry about the new connector when you have plenty of adapters to choose from, and even Ngreedia gives you one with a new GPU
#47
gurusmi
DY69SXAnd you see how snowflakes cry about the new connector when you have plenty of adapters to choose from, and even Ngreedia gives you one with a new GPU
Why should a company produce adapters if there is no need for them? Wasted money. Wasted resources and materials. Wasted power to produce them.

Edit:
Every adapter, no matter which kind, has production costs. The producer of the PSU or nVidia doesn't pay for that; they charge it to the customer. Don't even dream that you get them for free.
#48
shk021051
Nobody wants to upgrade their PSU every year.
#49
KarymidoN
Reality: people will need to use more adapters, which will result in more points of failure, more melting adapters and connectors. What a great solution.
#50
DY69SX
gurusmiWhy should a company produce adapters if there is no need for them? Wasted money. Wasted resources and materials. Wasted power to produce them.

Edit:
Every adapter, no matter which kind, has production costs. The producer of the PSU or nVidia doesn't pay for that; they charge it to the customer. Don't even dream that you get them for free.
What the hell are you talking about?! No one is forcing you to do anything! Yet!! And you are crying like your life depends on this!!

And to answer why adapters are produced: to give you the possibility of choice!! But anyway, you won't buy an RTX 50, so why are you crying