
Upcoming PCIe 12VHPWR Connectors Compatible with NVIDIA RTX 30-series Founders Edition Graphics Cards

I see absolutely no point in having those extra sensing pins. I wonder if they will actually end up implemented in new GPUs. So far, every piece of information related to them has shown no reason other than "the drawing/standard says so" and "they are just connected to ground". The PSU will not change the way it delivers power to the GPU, because that's not how a PSU works. Realistically, they can only be implemented in GPU firmware, and that adds no value to the product.

I just hate the idea. You have a neat bundle of 12 wires, and then there's this lower-gauge bullshit serving as... grounds, just tacked on :( If anyone has a better idea, I'd like to hear it.
 
Guys, keep in mind my 3090 with < 300 W power draw (undervolted) managed to melt 2x 8-pin PCI-E extension cables


They need to be over-specced for safety with wattages like this
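Rough back-of-the-envelope math shows why the margins matter. A minimal sketch, assuming the load splits evenly across the six +12V pins and using the commonly cited ~9.2 A per-contact rating for the Micro-Fit terminals (neither figure comes from this thread):

```python
# Rough per-pin current estimate for a 12VHPWR cable under load.
# Assumes the six +12V pins share current evenly; the 9.2 A figure
# is the commonly cited per-contact rating (an assumption here).

PIN_RATING_A = 9.2  # assumed per-contact current rating, in amps

def amps_per_pin(watts: float, volts: float = 12.0, power_pins: int = 6) -> float:
    """Current through each +12V pin, assuming an even split."""
    return watts / volts / power_pins

for load in (300, 450, 600):
    a = amps_per_pin(load)
    print(f"{load} W -> {a:.1f} A per pin ({a / PIN_RATING_A:.0%} of rating)")

# At 600 W that's ~8.3 A per pin, around 90% of the assumed rating,
# so any uneven current sharing or poor contact eats the margin fast.
```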
 
Oh, he raised a fair point: is the reason the 3090 Ti was delayed related to the new connector?
 
The linked VideoCardz article reveals enough information that we don't have to guess so much. The 12+4 pins are (or can be) all integrated into one connector with one cable, quite similar to the way the 20+4-pin motherboard connector is made.

Those four pins would be useful as voltage sense for both ground and +12V. Sense wires carry no current; they're there to detect voltage drops in the wires and connectors that do carry high currents. Every manufacturer will try to run high currents through thin wires and flimsy contact points just to save a few cents, so sense wires, uhm, do make sense. However, it seems that they will instead be used for signaling (high power allowed, voltage stable, cable present). SENSE0 might also be a true ground sense, but this is not clear.
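For what it's worth, here's a minimal sketch of how a GPU could decode two signaling pins into a power limit. The open/grounded truth table below is the one commonly reported for the 12VHPWR sideband pins, not something confirmed in this thread:

```python
# Hypothetical decode of the two 12VHPWR sideband sense pins.
# The truth table is the commonly reported mapping (an assumption
# here), where a grounded pin signals a higher allowed power tier.

SENSE_TABLE = {
    # (SENSE0 grounded, SENSE1 grounded): sustained cable power, watts
    (True, True): 600,
    (True, False): 450,
    (False, True): 300,
    (False, False): 150,
}

def cable_power_limit(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Power the attached cable/PSU advertises via its sense pins."""
    return SENSE_TABLE[(sense0_grounded, sense1_grounded)]

print(cable_power_limit(True, True))    # 600
print(cable_power_limit(False, False))  # 150, the floor when both float
```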
 
Wish I could speak German; the video looked interesting. The new connector makes sense though, I guess; otherwise, what would be next, 4x 8-pin connectors?
 
675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.

While many of us would expect greater efficiency gains to translate into lower power usage, this is actually not the case in basically every application. There's a well-documented phenomenon called the "rebound effect", and it has basically been the reality with respect to technological efficiency gains. What has been observed is that greater efficiency DOESN'T lead to less consumption, but rather the opposite. We can see this now with GPUs: they have the means to be more efficient than ever, yet they're steadily consuming more power than ever, and rumors say the next generation will be even worse.

A good illustration of the rebound effect is the work we do as people. Even though automation, better hardware, faster computers, etc. have made workers more efficient than ever, does this ACTUALLY translate into working less? No, it never does. This is predominantly attributed to the capitalist model of perpetual economic expansion: instead of maintaining output at its current level and using efficiency gains to work less, we work the same amount and increase output. When output increases, the economy grows and more output is desired, which translates into even more consumption and more output, so the efficiency gains actually result in increased consumption.

This is why the transition to 3 nm is likely to result in even more power-hungry GPUs. The fact that the "videocard" is a complete product and comes with its own cooling solution also reinforces this trend, as videocard manufacturers can use ever larger and more elaborate cooling solutions to buttress the trend of using more power. In fact, I think you could empirically graph an inverse relationship between shrinking GPU process nodes and increasing average videocard thickness... It wasn't so long ago that 3+ slot GPU coolers were an aberration rather than the norm. With CPUs it's slightly different, as the manufacturer cannot guarantee a specific level of cooling capability since that's left to the consumer/end user, though this has not hindered Intel from increasing power consumption. AMD, on the other hand, puts forth a greater effort at keeping power consumption steady while increasing performance, but even this still does not achieve a NET reduction in power consumption. Under our current economic system, though, especially the paramount importance of short-term shareholder returns over all other considerations, especially long-term ones, a net reduction in power consumption will never be the goal.

As long as consumers do not care about efficiency as a marketing point, more performance at greater power consumption will be the ongoing trend. I honestly believe that the 650 W-850 W PSUs typical in the vast majority of DIY builds will sooner rather than later give way to 1000 W-1200 W units.
What's funny is that I distinctly remember that when Maxwell was the current Nvidia generation, online Nvidia advocates would ceaselessly brag about efficiency, to the extent that it was one of the main points in any online argument about the "best GPUs", and with each subsequent generation Nvidia has released, efficiency as a salient point has dwindled into nothingness. Anyway, this is why future node shrinks and the capability for increased efficiency will not actually result in less power consumption, but more. The rebound effect is also why the overwhelming majority of leading thinkers on climate change, the environmental crisis, and the future of our species have concluded that technology by itself and technological "progress" will not solve any of these pressing issues.
 
There's also the psychological point of view: who would spend $£1300€ for a 100-watt card that only needs a toy cooler, regardless of performance?
 
However, what's more interesting in the news about the 12VHPWR connector is that it will operate in two distinct modes. If the 4-pin sense connector isn't connected to the GPU, the PSU will only deliver 450 Watts to the GPU, presumably as some kind of safety precaution. On the other hand, if the sense connector is used, the same cable can deliver up to 600 Watts, which would allow for a combined card power draw of up to 675 Watts for next generation GPUs.
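Putting numbers to those two modes (a sketch; the 75 W slot figure is the standard PCIe x16 slot allowance, which the article doesn't spell out):

```python
# Total board power under the two reported 12VHPWR modes, adding the
# standard 75 W PCIe x16 slot allowance (an assumption; the article
# only gives the cable-side figures of 450 W and 600 W).

SLOT_POWER_W = 75

def board_power(sense_connected: bool) -> int:
    cable_w = 600 if sense_connected else 450
    return cable_w + SLOT_POWER_W

print(board_power(True))   # 675 W, matching the figure in the article
print(board_power(False))  # 525 W in the fallback mode
```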

So, will there be separate cables, or is it all combined into one unified cable??
 
While many of us would believe that greater efficiency gains would practically translate to less power usage, this is actually not the case in basically every application. There's a well documented phenomenon called the "rebound effect", and it has basically been the reality with respect to technological efficiency gains. What has been observed is that greater efficiency DOESN'T lead to less consumption, actually the opposite, and we can see this now with GPUs, where they have the means to be more efficient than ever, and yet they're steadily consuming more power than ever, and rumors say the next generation will be even worse.
There's a much simpler explanation: the end of Dennard scaling. Power per transistor isn't decreasing at the same rate as area per transistor with each node shrink. So, as long as you want more than minuscule gains in performance with each node shrink, you have to increase power.
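A toy model makes the point; the scaling factors below are illustrative assumptions, not real process data:

```python
# Toy model of post-Dennard scaling: transistor count grows faster
# than per-transistor power falls, so total power rises. Both scaling
# factors are made-up illustrative numbers, not measured values.

transistors = 1.0  # relative transistor count, same die area
power_each = 1.0   # relative power per transistor

for shrink in range(1, 4):  # three hypothetical node shrinks
    transistors *= 2.0      # ~2x density per shrink
    power_each *= 0.7       # per-transistor power falls, but not by half
    print(f"shrink {shrink}: total power = {transistors * power_each:.2f}x")

# Prints 1.40x, 1.96x, 2.74x: unless per-transistor power halves each
# shrink (Dennard scaling), using the extra transistors costs power.
```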
 
Hey, I found this discussion and hope it can help me. I bought a ROG LOKI for my new build, so it has this "new" 12VHPWR connector, and I was wondering if I can use this cable (16-pin to 16-pin) with my RTX 3080 Founders Edition. The tiny 4-pin connector won't be plugged in on the GPU side, but it will be on the PSU side. Is this a problem? Can I use it like this?
 
I would not use any cable that isn't officially supported with your GPU. It may work fine if those pins are meant for the 40 series, or it may not; we can't answer that here.
 