Why's that? 600W at 12V is a total of 50 Amps. A single cable of the correct gauge to handle 50A is quite chunky and inflexible - it would need to be around 5mm in cross-section. Even if a single-pair cable were made with 50A wires and big, strong, chunky connectors, it would be so stiff and hard to bend towards the graphics card's connector that it'd likely just rip the connector off the board at the solder joints.
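As a back-of-envelope check (the ~5 A/mm² current density below is just an assumed rule of thumb for bundled chassis wiring, not a figure from any spec):

```python
# Rough arithmetic behind the single-cable argument (illustrative values only).
power_w = 600
voltage_v = 12
current_a = power_w / voltage_v                 # 600 / 12 = 50 A
current_density_a_per_mm2 = 5                   # assumed rule of thumb for bundled wire
copper_area_mm2 = current_a / current_density_a_per_mm2  # ~10 mm^2 of copper per conductor
print(current_a, copper_area_mm2)               # 50.0 10.0
```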
Splitting the 50A across 6 wire pairs makes it 8.3A per wire, which means you can use much thinner wires and connectors that are far easier to manipulate, and the cable's bend radius exerts much less force on the connectors. The alarming thing is that the small connectors and AWG16 wire used for 12VHPWR/12V-2x6 are only rated to 9.5A, and they're already carrying 8.3A by default, before any other factors are considered. The older PCIe 8-pin or 6+2-pin connectors are rated to 13A per wire and only carry 4.2A by default, so there's a huge difference in the safety margin.
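The per-wire numbers and safety margins above fall out of simple division (assuming current splits perfectly evenly, which is the best case):

```python
def per_wire_current(total_amps, pairs):
    """Split total connector current evenly across N wire pairs."""
    return total_amps / pairs

hpwr_a = per_wire_current(600 / 12, 6)    # 12VHPWR: 50 A over 6 pairs -> ~8.33 A/wire
pcie8_a = per_wire_current(150 / 12, 3)   # PCIe 8-pin: 150 W = 12.5 A over 3 pairs -> ~4.17 A/wire

hpwr_margin = 9.5 / hpwr_a    # ~1.14x headroom against the 9.5 A pin rating
pcie8_margin = 13 / pcie8_a   # ~3.12x headroom against the 13 A pin rating
print(hpwr_a, pcie8_a, hpwr_margin, pcie8_margin)
```

So the old 8-pin has roughly three times the headroom of 12VHPWR even when everything is working perfectly.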
What you gain by using more pairs of smaller wires is a cable that's easier to use and far more practical; the risk is that the current isn't distributed evenly across all of the wire pairs. Using shunt resistors to monitor current on the 12V wire pairs allows the VRM controller to load-balance all of the wires, but the GPU designs that have caused melted cables haven't done this, so all 600W (50A) can end up going over fewer wires - or even just one - which vastly exceeds their maximum current rating and gets them hot enough to melt or ignite the plastic connector and wire sheathing.
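A toy model makes the failure mode concrete: parallel wires on the same 12V rail share current in inverse proportion to their resistance, so one low-resistance path (or equivalently, several degraded high-resistance contacts) hogs the current. The resistance values here are invented for illustration, not measured connector data:

```python
# Toy model: N parallel wires between the PSU and GPU share current
# in inverse proportion to their (wire + contact) resistance.
def wire_currents(total_amps, resistances_ohm):
    conductances = [1 / r for r in resistances_ohm]
    g_total = sum(conductances)
    return [total_amps * g / g_total for g in conductances]

# All six contacts healthy: every wire carries ~8.3 A, within its 9.5 A rating.
even = wire_currents(50, [0.01] * 6)

# Five contacts degraded to 10x the resistance of the good one: the one good
# wire now carries ~33 A, far beyond the 9.5 A rating -- hot enough to melt.
skewed = wire_currents(50, [0.002, 0.02, 0.02, 0.02, 0.02, 0.02])
print(even[0], skewed[0])
```

Per-wire shunt monitoring lets the controller detect exactly this imbalance and react; without it, the connector just heats up silently.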
Edit:
If it hasn't been linked already, this video explains it all rather well: