
NVIDIA RTX 50-series "Blackwell" to Debut 16-pin PCIe Gen 6 Power Connector Standard

R.I.P. ATX 3.0 PSU buyers.
 
Jesus, can we maybe stop hopping standards all the time? This kinda makes the whole “standard” thing pointless. I don’t care what NV chooses, but at least stick with it.
 
I think that above 500 W GPU power consumption, or even 400 W... we seriously need to rethink "tech evolution", because powering framerates with watts shouldn't be the answer for "next-gen".
At some point we'll have PCs that are only a GPU, an SSD and a power supply. Components like the MB, RAM etc. will be pointless....
 
Having the ability to supply x amount of power doesn't automatically mean components using the connector will max it out. The 4090 uses a power connector rated for 600 W, but the typical max power draw is ~450 W, and the 4080 which uses the same connector stays below 350 W.
To be honest, 480 W at the GPU means 40 A at 12 V. That is a really huge current. Even 30 A is a lot. Just imagine: 50 mA through the heart is enough to stop it beating.
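For anyone who wants to sanity-check that arithmetic, here's a minimal Python sketch. The even split across six current-carrying 12 V pins is my assumption for illustration, not something from the post or the spec:

```
# Current drawn at 12 V for a given GPU board power, plus the per-pin
# share if the load splits evenly across the six current-carrying pins
# of a 12VHPWR plug (the even split is an assumption for illustration).

def current_a(power_w: float, volts: float = 12.0) -> float:
    return power_w / volts

for power_w in (350, 450, 480, 600):
    total = current_a(power_w)
    per_pin = total / 6  # assumed: 6 x 12 V pins sharing the load evenly
    print(f"{power_w:>3} W -> {total:5.1f} A total, ~{per_pin:.1f} A per pin")
```

So 480 W really does work out to 40 A total at 12 V, as stated above.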
R.I.P. ATX 3.0 PSU buyers.
I got one for my current rig. But it comes with a 2x8-pin PCIe adapter for using the new NVIDIA power connector on the PSU. ;)
 
Probably have a 24-pin by then, and require a 25 A breaker to run a top-of-the-line gaming PC

No one will be able to afford more than a $799 xx60 by then so who cares what we’re using for a power connector, we’ll all be priced out of the PC gaming market anyways.
 
So this amounts to Nvidia admitting the failure of the 12VHPWR.
What a drama it was, Nvidia pushing its own proprietary connector to become a public standard.
And now Nvidia wants to do it again?
 
So anyone who bought a new PSU with the "new" 12V connector basically got Jebaited
Naaaah it's fine man! 12VHPWR is fine too man! User error! Nvidia knows best! PCI-SIG is responsible! There is no problem! Connector ain't seated well, push it in properly, snowflake! They'll make a nice little adapter for it again come Blackwell, 12VHPWR to 16-pin. I hear CableMod is already working on perfecting their design, seeing as they need their sweet time given recent experiences.

This is fine.
Early adopting is fine.

(ICYMI, /s)

Jesus, can we maybe stop hopping standards all the time? This kinda makes the whole “standard” thing pointless. I don’t care what NV chooses, but at least stick with it.
Mandatory xkcd

 
We are all competing standards as humans, like demonstration models pac-manning about. 10 billion of them. The clunky connector has no business on the PSU side. But no, the PSU needs to know it has a GPU connected and how much power to reserve; that's why the signaling pins are there. PSUs that ship with it are still at a very experimental stage, so it could be discarded as a transitional period. Too bad, but they all carry a ton of connectors anyway, like SATA cables that very few people still use, so it's just a reminder of old tech. The PSU is still good, not obsolete.
 
With 16 pins, does that mean the connector supplies up to 800 W? Since the 4090 goes up to almost 500 W, does that mean the 5090 will need over 600 W? I would not be surprised, as the 3 nm node is nowhere near ready for this kind of chip, which means Blackwell will be on the same 4 nm process.
Hopefully this means they have built in extra redundancy. 8-pin connectors have 30-60% additional wattage capacity over their rated 150 W load; the 12VHPWR only has 10% or so.

The extra capacity can be helpful if, say, one pin doesn’t make full contact and the other pins need to increase load to compensate.
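To make that headroom comparison concrete, here's a rough Python sketch. The per-pin current ratings are assumptions (ballpark figures, not official spec values), picked only to illustrate how the 30-60% vs ~10% margins come about:

```
# Headroom = how far a connector's physical pin capacity exceeds its
# rated load. Per-pin amp figures below are assumed ballpark values,
# not official spec numbers.

def headroom_pct(rated_w: float, pins_12v: int, amps_per_pin: float,
                 volts: float = 12.0) -> float:
    physical_limit_w = pins_12v * amps_per_pin * volts
    return (physical_limit_w / rated_w - 1.0) * 100.0

# 8-pin PCIe: rated 150 W, 3 current-carrying 12 V pins, ~6 A/pin assumed
print(f"8-pin PCIe headroom: ~{headroom_pct(150, 3, 6.0):.0f}%")
# 12VHPWR: rated 600 W, 6 current-carrying 12 V pins, ~9.2 A/pin assumed
print(f"12VHPWR headroom:    ~{headroom_pct(600, 6, 9.2):.0f}%")
```

With those assumed ratings the 8-pin lands in the quoted 30-60% band and the 12VHPWR at roughly 10%, which is why a single degraded pin hurts the newer connector much more.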

Also, while there are 600 W BIOSes out there for 4090s, short of shunt-modding the power circuits on the card, the most they pull is ~550 W, and even that is sort of a waste, as the performance difference between that and a 450 W BIOS, which many entry-level 4090s have, is only a couple of percent.

The 4090 is really a 450 W card. It can take an extra 100 W if you want to squeeze the last couple of percent out, but that's not really indicative of the card. Heck, many 4090 users undervolt their cards to 300-350 W and keep 90-95% of the performance of the 450 W config.
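As a quick gut-check on that last claim, here's the perf-per-watt arithmetic using the figures from the post itself (450 W stock vs. a 300-350 W undervolt keeping 90-95% of the performance); purely illustrative:

```
# Relative performance-per-watt, normalised to the 450 W stock config,
# using the percentages quoted in the post above.

configs = {
    "450 W stock":     (450, 1.00),
    "350 W undervolt": (350, 0.95),
    "300 W undervolt": (300, 0.90),
}

stock_w, stock_perf = configs["450 W stock"]
for name, (watts, rel_perf) in configs.items():
    perf_per_watt = (rel_perf / watts) / (stock_perf / stock_w)
    print(f"{name}: {rel_perf:.0%} perf at {watts} W -> {perf_per_watt:.2f}x perf/W")
```

That works out to roughly a 20-35% efficiency gain for giving up only 5-10% of the performance, which is why so many 4090 owners run undervolted.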
 
A change to make it safer... everyone complains... lol

The world is so fucked.
 
You didn't seriously just use Moore's Law Is Dead as a source

-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.

The market has a need (theory crafting tech stuff) and MLID (as well as other equally prolific and unreliable "leakers") fill that need by partly making up stuff and throwing it out to the masses ravenous for info.

Even aggregator sites like TPU and Techspot (and more) have started laundering this content to give it an air of validity because their users eat this stuff up and it gets page hits.

Whole industry getting built around BS leaks, it's where the money is.
 
Update 15:48 UTC: Our friends at Hardware Busters have reliable sources in the power supply industry with equal access to the PCIe CEM specification as NVIDIA, and say that the story of NVIDIA adopting a new power connector with "Blackwell" is likely false. NVIDIA is expected to debut the new GPU series toward the end of 2024, and if a new power connector was in the offing, by now the power supply industry would have some clue. It doesn't. Read more about this in the Hardware Busters article in the source link below.
Welp, not surprised
 
-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.

The market has a need (theory crafting tech stuff) and MLID (as well as other equally prolific and unreliable "leakers") fill that need by partly making up stuff and throwing it out to the masses ravenous for info.

Even aggregator sites like TPU and Techspot (and more) have started laundering this content to give it an air of validity because their users eat this stuff up and it gets page hits.

Whole industry getting built around BS leaks, it's where the money is.
Yep, even more worrying is that TPU made up this "16 power pins" connector. MLID doesn't say that Blackwell will use a new connector; he just refers to the 12V-2x6 as the 16-pin Nvidia connector. You know, 16 pins as 2x6+4=16. I couldn't find anywhere in the transcript of the MLID video a "leak" of a connector with 2x8 power pins.
 
and if you ever need more power, you can always use a pair of them!

No thanks, using one is risky enough....
 
A change to make it safer... everyone complains... lol

The world is so fucked.
The world is more fucked when you consider how this ever passed common-sense QA. Also, I'm still in wait-and-see mode wrt this actually happening. The source is MLID.

None of this should have ever happened. The 4090 is the only card that would need an extra 8-pin. And it's a fucking brick-sized GPU. Space enough.

It's always good to keep going back to the why. Who benefits here? The 4090 with the 12VHPWR adapter is too wide for most cases. It's fucking useless. It's placed in the wrong location. Need we go on?
 
Maybe the 16-pin was on the table before NVIDIA knew AMD wasn't making a high-end RDNA4, but it seems NVIDIA has dropped their multi-chip flagship card, as there will be no competition for them in the high end. A dual-die halo SKU would probably have needed more than 600 W.
 
-Yeah, I'm starting to understand why MLID is a thing. People don't want actual leaks, they just want controversy to argue around.
Yeah, except the part where he gaslights these fights. Didn't he also delete a few videos after some of the "leaks" ended up worse than horse sh!t? :rolleyes:

Sounds like a cheap version of The Daily Bugle Mail :shadedshu:
 
Welp, not surprised
I don't think he ever stated that it was an entirely new 16-pin; the emphasis was that Nvidia is doubling down on the existing one. The confusion comes from the multiple names: is it 12+4, or 16, or 2x6 elongated plus 4 recessed signaling pins that we don't count? It would have been better to just make it a 3x6 and dedicate the upper 1x6 row to communicating with the GPU; now it's just butt ugly.
 
I want to say something but I can't find the words... because I'm speechless...
 
A new day, a new PSU connector standard.
 
Wouldn't really make sense, unless it's a 4x8pin to 16pin adapter.
Who cares if you get an adapter with it ( ͡° ͜ʖ ͡° )つ──☆*:・゚ And the way people cry about it makes me rethink the intelligence of people across the world (ʘ‿ʘ)

R.I.P. ATX 3.0 PSU buyers.
Why is that?? Does that mean all cable adapters stop existing, or that Ngreedia stops including adapters with new cards? (ʘ‿ʘ)

I got one for my current rig. But it comes with a 2x8-pin PCIe adapter for using the new NVIDIA power connector on the PSU. ;)
And you see how the cry flakes cry about the new connector when you have plenty of adapters to choose from, and Ngreedia even gives you one with the new GPU.
 
And you see how the cry flakes cry about the new connector when you have plenty of adapters to choose from, and Ngreedia even gives you one with the new GPU.
Why should a company produce adapters if there is no need for them? Wasted money. Wasted resources and materials. Wasted power to produce them.

Edit:
Every adapter, no matter which kind, has production costs. The PSU maker or NVIDIA doesn't pay for that; they charge it to the customer. Don't even dream that you get them for free.
 
Nobody wants to upgrade their PSU every year
 
Reality: people will need to use more adapters, which will result in more points of failure and more melting adapters and connectors. What a great solution.
 