
It's happening again, melting 12v high pwr connectors

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,704 (4.15/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
"Should not use", and "can not get" are 2 different things. What the chart shows is that you shouldn't use 600w for the gpu on a sub 1100w psu. That's not to say that you can't, but you shouldn't - it's just asking for trouble.
Ohhhh
 
Joined
Nov 16, 2023
Messages
1,820 (4.02/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7GHz (dry ice); Max Freq 14700K 7.0GHz (dry ice); Max all-time Freq FX-8300 7685MHz (LN2)
"Should not use", and "can not get" are 2 different things. What the chart shows is that you shouldn't use 600w for the gpu on a sub 1100w psu. That's not to say that you can't, but you shouldn't - it's just asking for trouble.
1500w or bust, muh cpu need loads too! :roll:
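Joking aside, here is a rough sketch of where a guideline like "1100 W PSU for a 600 W GPU" can come from; every number below is an assumption for illustration, not something taken from the chart being discussed.

```python
# Rough sketch of a PSU sizing estimate; all figures are assumptions.
gpu_w = 600      # GPU power limit under discussion
cpu_w = 250      # assumed high-end CPU under load
rest_w = 100     # assumed board, RAM, fans, drives, peripherals
headroom = 1.20  # assumed ~20% margin for transients and PSU ageing

recommended = (gpu_w + cpu_w + rest_w) * headroom
print(f"Suggested PSU size: ~{recommended:.0f} W")  # ~1140 W
```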
 
Joined
Jul 13, 2016
Messages
3,483 (1.11/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Ugh…
Aftermarket 12vhpwr cables bad..
OEM 12vhpwr cables good.

What is the basis for this argument? Cards with OEM cables have burned up just the same as non-OEM cables.

What exactly would make all 3rd party cables bad? They follow the same spec as the OEM cables and no evidence has been put forth that these are examples of bad cables. There's no nuance to this argument, no specifics pointed out as to why any given 3rd party cable would be considered bad. The ironic part is some of those 3rd party cables may very well be manufactured in the same factory as the OEM cables. That's pretty common in China. The only difference is the logo.

Let's be honest, trusting that Nvidia's adapters are somehow better than everyone else's cables / adapters is foolhardy. Just in the 3000 series alone, Nvidia had cards that bricked themselves, had massive power spikes, and fed noise into the 12V sense pin that tripped OCP on some PSUs. Heck, this wouldn't even be nearly as big an issue if Nvidia had simply included two connectors. Nvidia does not have a good reputation when it comes to power delivery.
 

qxp

Joined
Oct 27, 2024
Messages
135 (1.26/day)
One problem with all the previous "bad cable" arguments is that they are after the fact: it melted, so some say it was a bad cable. But for a lot of people the question is: you bought a card, you got a cable, and you connect them; will it melt? The expectation is that if the card and the cable support the same standard, as these did, and you connect them, as this person did, it should work.

How to do this safely was figured out more than 100 years ago: every electrical circuit that might melt or catch fire when a component malfunctions has a fuse. So either the connector should be designed so that it can never get hot enough to melt, or there should be fuses somewhere in the electrical chain.
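Putting rough numbers on that, here is a minimal sketch of per-pin current as contacts are lost; the 600 W load, six supply pins, and 9.5 A per-contact rating are the commonly quoted figures and are treated as assumptions here.

```python
# Back-of-the-envelope: per-pin current on a 12VHPWR connector as contacts
# are lost. All figures below are assumed, commonly quoted values.
TOTAL_POWER_W = 600.0
VOLTAGE_V = 12.0
POWER_PINS = 6        # six 12 V supply contacts share the load
PIN_RATING_A = 9.5    # often-quoted per-contact rating

total_current = TOTAL_POWER_W / VOLTAGE_V  # 50 A

for good_pins in range(POWER_PINS, 0, -1):
    per_pin = total_current / good_pins    # naive even split over remaining pins
    status = "OK" if per_pin <= PIN_RATING_A else "OVER RATING"
    print(f"{good_pins} pins carrying the load: {per_pin:.1f} A per pin ({status})")
```

With all six pins seated you are already near the rating (about 8.3 A each); lose one or two contacts to bad seating or a bent pin and the rest go well past it, which is exactly the overheating scenario, and there is no fuse in that chain to open first.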
 
Joined
Mar 7, 2007
Messages
3,992 (0.61/day)
Location
Maryland
System Name HAL
Processor Core i9 14900ks @5.9-6.3
Motherboard Z790 Dark Hero
Cooling Bitspower Summit SE & (2) 360 Corsair XR7 Rads push/pull
Memory 2x 32GB (64GB) Gskill trident 6000 CL30
Video Card(s) RTX 4090 Gigabyte Gaming OC @ +200/1300
Storage (M2's) 2x Samsung 980 pro 2TB, 1xWD Black 2TB, 1x SK Hynix Platinum P41 2TB
Display(s) 65" LG OLED 120HZ
Case Lian Li Dynamic Evo 11 with distro plate
Audio Device(s) Klipsch 7.1 through Sony DH790 eARC.
Power Supply Thermaltake 1350
Software Microsoft Windows 11 x64
What is the basis for this argument? Cards with OEM cables have burned up just the same as non-OEM cables.

What exactly would make all 3rd party cables bad? They follow the same spec as the OEM cables and no evidence has been put forth that these are examples of bad cables. There's no nuance to this argument, no specifics pointed out as to why any given 3rd party cable would be considered bad. The ironic part is some of those 3rd party cables may very well be manufactured in the same factory as the OEM cables. That's pretty common in China. The only difference is the logo.

Let's be honest, trusting that Nvidia's adapters are somehow better than everyone else's cables / adapters is foolhardy. Just in the 3000 series alone, Nvidia had cards that bricked themselves, had massive power spikes, and fed noise into the 12V sense pin that tripped OCP on some PSUs. Heck, this wouldn't even be nearly as big an issue if Nvidia had simply included two connectors. Nvidia does not have a good reputation when it comes to power delivery.
The basis is that people are bending the wires, not making sure that the pins are secured and snapped in, and using aftermarket cables, and that's what's leading to these issues.
I have a hard time believing people can keep willfully ignoring these facts, but here we are.
 
Joined
Sep 3, 2019
Messages
3,815 (1.92/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 85C temp limit, CO -8~14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MT/s 1.38V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.3037), upgraded from Win10 to Win11 on Jan 2024
The basis is that people are bending the wires, not making sure that the pins are secured and snapped in, and using aftermarket cables, and that's what's leading to these issues.
I have a hard time believing people can keep willfully ignoring these facts, but here we are.
And I have a hard time believing that Nvidia skipped electronic components that cost maybe less than $1 each, which could monitor the pin current and prevent this from happening, but here we are.
I find it doubly "stupid" for "smart" engineers not to predict that ignorant and careless people will do things wrong. Yeah... because the 8+ billion people on this planet never do anything wrong, except some RTX 40/50 owners.
Unless something else is going on here.
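For what such monitoring might look like in principle, here is a purely illustrative sketch; read_shunt_current() and both thresholds are hypothetical stand-ins, not anything Nvidia or any AIB is known to ship.

```python
# Purely illustrative: per-pin current monitoring on the card side.
# read_shunt_current() is a hypothetical helper standing in for a shunt
# resistor + current-sense amplifier feeding an ADC on each 12 V pin.
PIN_COUNT = 6
PIN_LIMIT_A = 9.5         # assumed per-contact rating
IMBALANCE_LIMIT_A = 3.0   # assumed allowed spread between pins

def read_shunt_current(pin: int) -> float:
    """Hypothetical: measured current on one 12 V pin, in amps."""
    raise NotImplementedError("hardware-specific")

def check_connector() -> str:
    currents = [read_shunt_current(p) for p in range(PIN_COUNT)]
    if max(currents) > PIN_LIMIT_A:
        return "shutdown"   # a single pin is over its rating
    if max(currents) - min(currents) > IMBALANCE_LIMIT_A:
        return "throttle"   # load badly shared, likely a poor contact
    return "ok"
```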
 
Joined
Nov 16, 2023
Messages
1,820 (4.02/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7GHz (dry ice); Max Freq 14700K 7.0GHz (dry ice); Max all-time Freq FX-8300 7685MHz (LN2)
One problem with all the previous "bad cable" arguments is that they are after the fact: it melted, so some say it was a bad cable. But for a lot of people the question is: you bought a card, you got a cable, and you connect them; will it melt? The expectation is that if the card and the cable support the same standard, as these did, and you connect them, as this person did, it should work.

How to do this safely was figured out more than 100 years ago: every electrical circuit that might melt or catch fire when a component malfunctions has a fuse. So either the connector should be designed so that it can never get hot enough to melt, or there should be fuses somewhere in the electrical chain.
Right, they use low-temp plastics, obviously. These connectors don't need to be red hot, apparently.

It's a male/female connector.

-o

Some females may have had children. -O
Maybe females had too many males -O
Maybe the male went in on an angle -O
Maybe the user yanked too hard on the connector to cause perhaps one of the above scenes!

And then there's some tension and friction, the old man's on the couch again looking at new female ends to replace in the old molex.

Old wives' tales....
 
Joined
Dec 31, 2020
Messages
1,134 (0.75/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
This cable is a careless mess; the user was asking for trouble. All receptacles were forced at angles by the zip tie, and the user couldn't be bothered to use a cable comb. And sleeved cables are way too thick for this small connector.
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
The fact that he used a third-party connector is not an excuse for this recurring issue.

If manufacturers are seemingly unable to make products that work fine with these connectors time and time again, then clearly the problem isn't just on their end; the design still sucks just as it did before.
Agreed, third-party cable or not, it should be safe! Period.
Knowing how few RTX 5090 units have been sold globally, we won't know whether the new connectors are really safe (or not) for a while...
 
Joined
Dec 31, 2020
Messages
1,134 (0.75/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
This is how it should look:
[attached image]

The user crossed them over and under, almost like a braid, jumped on top, and zip-tied them in that position. Yeah, the third-party maker should take that into account too, I guess.
 
Joined
Oct 19, 2022
Messages
335 (0.40/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
Because you can't fit 4x8pin connectors on this for a total of 600W

View attachment 384131

Why they chose this design instead of the more traditional one, I cannot tell.
I don't have a problem with pushing tech to new things, as long as it's done right and with proper safety mechanisms, like monitoring the current of such a small, high-power connector.
It's just a few extra $$, but that would've pushed them to stick to a larger PCB.

I am just stating the facts


Exactly double per pin compared to the traditional 8-pin, and that is a lot more.
The problem is that Nvidia pushed for the 16-pin connector to reach 600 W, but now that their GPUs can reach 600 W they won't even allow AIBs to use 2x 16-pin connectors on custom models...
Also, I feel like Nvidia doesn't want AIBs to make great overclockable GPUs anymore, since that would mean fewer FE sales and lower margins.
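For reference, a quick calculation backs up the "exactly double per pin" point quoted above; the 150 W / 600 W connector ratings and supply-pin counts are the standard spec figures and are assumed here.

```python
# "Exactly double per pin": 8-pin PCIe (150 W over 3 supply pins) versus
# 12VHPWR (600 W over 6 supply pins), using the usual spec numbers.
def amps_per_pin(power_w: float, pins: int, volts: float = 12.0) -> float:
    return power_w / volts / pins

eight_pin = amps_per_pin(150.0, 3)  # 8-pin PCIe
hpwr = amps_per_pin(600.0, 6)       # 12VHPWR / 12V-2x6

print(f"8-pin PCIe: {eight_pin:.2f} A per pin")  # ~4.17 A
print(f"12VHPWR:    {hpwr:.2f} A per pin")       # ~8.33 A, exactly double
```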
 
Joined
Dec 2, 2024
Messages
26 (0.36/day)
Location
Wet Coast of Canada Eh!
System Name NON-AMD Folding Farm
Processor Intel i7 7700K-9700K-14700K
Motherboard Asus WS Z390 Pro's, Z270 P, TUF Z390-Plus Gaming Wifi's, Prime Z790 P Wifi
Cooling Coolermaster 212s, EVGA AIOs, EVGA CLx AIO
Memory Vengeance
Video Card(s) 4090FE, 4090 MSI Suprim AIO, MSI 4090 X Tri(will edit)'s, EVGA 3080 FTW3's
Storage WD HD's, WD NVME's
Display(s) You would ask...LoL 7 Colour Monitors all given to me for free
Case Rosewill Mining Cases, Coolermaster HAF Evos
Power Supply EVGA 1600P2's, Seasonic Platinum 1300's, EVGA Platinum 750's, EVGA Gold 750's, EVGA Bronze 650's
Mouse Logitech's
Keyboard Logitech's, Old school ps2's, (will edit)
Software Win 10 Pro's, AI Suite 3, Precison X1's, Afterburner's, AVG Free, Bionic's, FAH's
Benchmark Scores HuH?
I feel I need to make some kind of comment on this issue.
It scares me when I think of all the 12pins running 24/7 in my basement.
I watched JayzTwoCents and Gamers Nexus when they were detailing the pin failures on the 4090s, and
I determined that once I install the OEM 12-pin connector into my 4090(s), it stays there.
If I need to disconnect, it will be at the 4x 8-pins.
I was actually surprised to see a new 1300 W Seasonic PSU with a straight 12-pin cable at both ends (would this increase the risk?)
How can I leave the cable connected to the power supply and 4090..forever?...lol
I would have clicked a like on just about everyone's post, but to be honest I don't feel qualified to agree or disagree with some of what you guys are saying... sorry...
If you feel you deserve or need a response from me please PM me and explain the point you are trying to make...KIDDING! ;)
It is my ignorance and I apologize if you feel left out..It isn't you..it is me!
I feel that if you use the adapter that comes with the card and use it correctly, you stand a better chance of restitution if it fails than if you use a third party's 12-pin. And since the 12-pin has a maximum number of mating cycles, having it at both ends increases its failure rate.
IMHO...
They could have easily designed a larger 12-pin (or whatever) connector that would have been much better suited. One that would have been closer to dummy-proof.
:)

This is how it should look:
View attachment 384312
The user crossed them over and under, almost like a braid, jumped on top, and zip-tied them in that position. Yeah, the third-party maker should take that into account too, I guess.

I was going to agree that a new cable, not an abused used one, is the way to go, but I still don't like both ends being 12-pin... It is the weak point in the system, and you now have two weak points instead of one. I am nervous about using the 12-pin... sorry.
 
Joined
Feb 18, 2005
Messages
6,049 (0.83/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
This is how it should look:
View attachment 384312
The user crossed them over and under, almost like a braid, jumped on top, and zip-tied them in that position. Yeah, the third-party maker should take that into account too, I guess.
This is another reason why I hate this stupid braided/individually sleeved cable nonsense: it ends up causing more of a rat's nest than what we used to have, which was one sleeve over all the wires, overall thinner and easier to route.
 