Tuesday, December 26th 2023

Seasonic Unveils 12V-2x6 (12VHPWR Successor) Cable, Invites Beta Testing

Seasonic is ready with a modular PSU cable for the new 12V-2x6 standard, which succeeds 12VHPWR. Intel felt the need to introduce the new cable standard as part of the new ATX 3.1 specification because 12VHPWR is flimsy and poses a burn-out hazard in case of improper contact. For modular PSU manufacturers, implementing the new connector is as easy as selling or giving away a modular cable that plugs into the 12VHPWR connector on the PSU's backplane and puts out a 12V-2x6 connector on the other end. As a leading power supply OEM, Seasonic is handing out such cables for free, but there's a catch.

Seasonic's cable plugs into a 12VHPWR connector on the PSU's backplane on one end, and puts out a new ATX 3.1-compliant 12V-2x6 connector on the other, which promises greater mechanical stability than 12VHPWR. The cable is free; however, it requires users to sign up for Seasonic's "Beta Tester Program." We haven't read the program's terms and conditions, but we expect it to be a form of waiver against damage liability: you use the cable at your own risk. The cable features 16 AWG wires and a connector angled at 90°. It is capable of 600 W of continuous output, with power excursions within the ATX 3.1 and PCIe CEM 5.1 specifications.
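For a sense of what 600 W continuous means at the connector, here is a quick back-of-the-envelope check. This is only a sketch: the six-pair pin count comes from the 12V-2x6 pinout, and the ~9.5 A terminal rating mentioned in the comment is a commonly cited figure for this connector family, not something Seasonic states for this cable.

```python
# Rough sanity check of the 600 W continuous rating (a sketch; pin count
# is from the 12V-2x6 pinout, not from Seasonic's documentation)
POWER_W = 600.0    # rated continuous output
VOLTAGE_V = 12.0   # nominal rail voltage
PIN_PAIRS = 6      # current-carrying 12 V pin pairs in 12V-2x6

total_current_a = POWER_W / VOLTAGE_V     # 50.0 A total
per_pin_a = total_current_a / PIN_PAIRS   # ~8.33 A per pin pair

print(f"{total_current_a:.1f} A total, {per_pin_a:.2f} A per pin pair")
```

At roughly 8.3 A per pin pair against terminals commonly rated around 9.5 A, the margin is slim, which is why contact quality matters so much for this connector.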
Source: Wccftech

24 Comments on Seasonic Unveils 12V-2x6 (12VHPWR Successor) Cable, Invites Beta Testing

#1
Chaitanya
Since beta testing and victim blaming by the GPU maker wasn't enough, now we have a PSU maker calling for more beta testing.
Posted on Reply
#2
Crackong
Beta testing a $20 cable at the risk of burning your $2000 GPU.
Posted on Reply
#3
HisDivineOrder
This whole mess with the cabling for the 4090 should be a red flag to everyone that it's just way too much power to be putting through a consumer card and Nvidia should rethink their design choices.
Posted on Reply
#4
Dr. Dro
HisDivineOrderThis whole mess with the cabling for the 4090 should be a red flag to everyone that it's just way too much power to be putting through a consumer card and Nvidia should rethink their design choices.
I disagree. They could very well have limited the RTX 4090 to ~350 W - and as far as performance is concerned, it'd still obliterate the 7900 XTX.

The true reason for such high power allowances is to correct the critical design fault that plagued the vanilla RTX 3090, especially if you had a standard-power model such as the ASUS TUF OC that I had. Clocks were crap, and the card throttled to the high heavens if you dared run any heavy raytracing or compute workload on it; I recall my personal 3090 going as low as 1200 MHz during 4K RT workloads while constantly hitting the 375 W power limit (at the full 107% limit). This strict limit coupled with a rudimentary first generation GDDR6X implementation that had no less than 24 memory chips chugging at up to ~130 W for memory only were/are a severe fault of that specific graphics card. It murdered performance. There is no other way: you either let the processor chug power, or you throttle the performance, and not even the massive jump from Samsung 8N to TSMC's custom N4 node, coupled with the architectural improvements and careful planning of each and every SKU's configuration, would make up for that.
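To put the commenter's numbers together, a minimal power-budget sketch using the figures claimed above; both are the commenter's estimates, not measured values:

```python
# Power-budget sketch using the figures from the post above
# (commenter's estimates, not measurements)
board_limit_w = 375.0  # RTX 3090 at the raised 107% power limit
memory_w = 130.0       # claimed worst-case draw of the 24 GDDR6X chips

core_budget_w = board_limit_w - memory_w  # remainder for GA102 core + VRM losses
print(f"~{core_budget_w:.0f} W left for the core")  # ~245 W, hence the throttling
```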

For the RTX 4090, Nvidia opted to implement the improvements they had made to the RTX 3090 Ti design: double-density, second-generation 21 Gbps G6X memory chips and an improved power delivery mechanism. Giving these cards such a high power allowance solved the aforementioned problem and enabled Nvidia to sell poorly binned AD102 dies with an insane number of disabled compute units and cache slices (12.5% of compute and 25% of cache). A hypothetical 4090 Super or Titan Ada with the full AD102 and the newer memory chips the 4080 has would lay down the law on the original 4090, and that card as a product is already so absurdly powerful that it attracted the attention of regulatory agencies; it's literally faster than what the government is comfortable with you having access to. Keep that in mind!

With the RTX 4080, they went a step further and took a page out of AMD's own playbook, following the same formula AMD used for the RX 6500 XT: they took a smaller, power-efficient AD103 die and gave it almost similarly ludicrous power limits that the card, under normal operation, will never realistically require, even considering the raytracing/compute-heavy workloads I brought up earlier. That allowed it to stretch clocks to the absolute maximum (3 GHz+ on a 4080 is cake; my ROG Strix does it with the fans off on a cool day), and they also took the time to introduce the third-generation 24 Gbps G6X chips I mentioned earlier. This is why the 4080, despite being a 256-bit card, doesn't see a proportional 33% reduction in memory bandwidth compared to the 4090: the bus may have gone from 384-bit to 256-bit, but memory clocks also went up, recovering bandwidth and reducing latency. The result is that while the 4080 is obviously a card that should have been a 4070 at best, it's also one that punches way above its intended weight (it will basically never throttle or misbehave on you because it's starved of power or resources), and that's why it's sold as a high-end card. It ended up being a balanced configuration with good utilization of hardware resources: power efficiency be damned, but who cares, it's still under 250 W most of the time even if you're abusing it with, say, some 4K RT Metro Exodus.
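The bandwidth point checks out arithmetically: peak GDDR bandwidth is (bus width / 8) times the per-pin data rate. A quick sketch with the public launch specs; note the 4080 shipped with its G6X clocked at 22.4 Gbps effective, below the chips' 24 Gbps rating:

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

bw_4090 = bandwidth_gbs(384, 21.0)  # 1008.0 GB/s
bw_4080 = bandwidth_gbs(256, 22.4)  # 716.8 GB/s
drop = 1 - bw_4080 / bw_4090        # ~0.289: a ~29% drop, not a full 33%

print(f"4090: {bw_4090:.0f} GB/s, 4080: {bw_4080:.1f} GB/s, drop: {drop:.1%}")
```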

I can't fault Nvidia's genius here: they know how to make a good GPU, and above that, they know how to make a good product out of "garbage", which is really what the 4090 is all about.
CrackongBeta testing a $20 cable at the risk of burning your $2000 GPU.
Is this not what being a tech enthusiast is all about? Trailblazing often involves an actual blaze :D
Posted on Reply
#5
Macro Device
Dr. DroThis strict limit coupled with a rudimentary first generation GDDR6X implementation that had no less than 24 memory chips chugging at up to ~130 W for memory only were/are a severe fault of that specific graphics card. It murdered performance. There is no other way
I'd try undervolting the VRAM. Is that possible on Ampere?
Posted on Reply
#6
ZoneDymo
If even Seasonic fails to make something not look jank af... you know we are in trouble.

And I said it before and I'll say it again: screw this connector nonsense. Up the damn PCI-E standard so we can finally move on from the more-than-two-decades-old 75 W limit to, idk, 300 watts?
Posted on Reply
#7
Dr. Dro
Beginner Micro DeviceI'd try undervolting the VRAM. Is that possible on Ampere?
I don't think that's possible, or it'd be one very popular "mitigation" for the 3090. People just rushed to do bizarre pad mods or cover their cards in heat sinks, which was very popular when people were pushing them even further to mine on the poor things.

Posted on Reply
#8
R0H1T
Dr. DroIs this not what being a tech enthusiast is all about? Trailblazing often involves an actual blaze :D
Yes yes & while we're at it why don't we put our heads in a microwave & put it on roast? Like that Loki episode :ohwell:
Posted on Reply
#9
Dr. Dro
R0H1TYes yes & while we're at it why don't we put our heads in a microwave & put it on roast? Like that Loki episode :ohwell:
What I had in mind was closer to this :D

Posted on Reply
#10
TheDeeGee
One thing left to do.

Tell GPU manufacturers to all use the same connector orientation on the GPU, as right now some of them are flipped 180°.
Posted on Reply
#11
SL2
Dr. DroThey could very well have limited the RTX 4090 to ~350 W - and as far as performance is concerned, it'd still obliterate the 7900 XTX.
No one ever expected those two to be comparable. Hint: price & transistor count.
Posted on Reply
#12
Zforgetaboutit
It's my understanding that this is progress at the GPU end of the cable.

The "refreshed" nVidia GPUs may have ATX 3.1 PCB changes. Seasonic's existing PSU PCBs don't.

And so, I can't call this setup 100% "worry free," but it's appreciated as a big step in the right direction, even if, for the present, it's still in beta testing.

Kudos to Seasonic.
Posted on Reply
#13
SL2
ZforgetaboutitThe "refreshed" nVidia GPUs may have ATX 3.1 PCB changes.
Why do you say that?
Posted on Reply
#16
bonehead123
"Absolute power corrupts (fries) absolutely" - Lord Acton (well sorta)
Posted on Reply
#17
Wirko
"We will send you a free fire extinguisher with each order if you so choose. You must accept that the fire extinguisher is beta, too."
Posted on Reply
#18
Assimilator
SL2No one ever expected those two to be comparable. Hint: price & transistor count.
AMD certainly did when they named their card "7900 XTX".
Posted on Reply
#19
SL2
AssimilatorAMD certainly did when they named their card "7900 XTX".
You're conflating marketing with expectations.
Posted on Reply
#20
Crackong
Dr. DroIs this not what being a tech enthusiast is all about? Trailblazing often involves an actual blaze :D
It is kinda sad seeing CableMod willing to offer a warranty for issues caused by their 90-degree adaptors,
while giant PSU manufacturers like Seasonic want you to sign an agreement so they don't need to beta test their own cables.
Posted on Reply
#21
Dr. Dro
CrackongIt is kinda sad seeing CableMod willing to offer a warranty for issues caused by their 90-degree adaptors,
while giant PSU manufacturers like Seasonic want you to sign an agreement so they don't need to beta test their own cables.
As I understand it, CableMod was selling a finished product, built to PCI-SIG's spec, and thus bound to stand by their products, legally or otherwise, while this is a pre-release cable and it looks like Seasonic is footing the bill, providing it to interested parties for free. If you willingly request this pre-release sample, which you're not obligated to do, it's only fair that you sign a waiver in case things go south. Which is pretty damn unlikely, mind you; I think these cables are probably bulletproof by now. If I had a Seasonic power supply I would request one myself, even if just to keep as a spare.
Posted on Reply
#22
Crackong
Dr. DroAs I understand it, CableMod was selling a finished product, built to PCI-SIG's spec, and thus bound to stand by their products, legally or otherwise, while this is a pre-release cable and it looks like Seasonic is footing the bill, providing it to interested parties for free. If you willingly request this pre-release sample, which you're not obligated to do, it's only fair that you sign a waiver in case things go south. Which is pretty damn unlikely, mind you; I think these cables are probably bulletproof by now. If I had a Seasonic power supply I would request one myself, even if just to keep as a spare.
I think there is a contradiction here.

If the cable is bulletproof by now, Seasonic wouldn't have to provide pre-release samples for testing and feedback with an agreement attached.
They could just put a tag on it and start selling.

If the cable isn't ready and requires testing and feedback, Seasonic should do their own testing:
1. With their own 4090s
2. In their own labs
Posted on Reply
#23
Legacy-ZA
Man, they really know how to double down on patents that aren't working, don't they? And now you have to pay to BETA test their junk too? :banghead:

Clown world. :roll:
Posted on Reply
#24
shaolin95
HisDivineOrderThis whole mess with the cabling for the 4090 should be a red flag to everyone that it's just way too much power to be putting through a consumer card and Nvidia should rethink their design choices.
If it was made by ATI, you would have an altar for the 4090... gotta love fanboys.
Posted on Reply