Thursday, February 3rd 2022

Upcoming PCIe 12VHPWR Connectors Compatible with NVIDIA RTX 30-series Founders Edition Graphics Cards

As you're most likely aware, NVIDIA introduced a new power connector with its RTX 30-series Founders Edition graphics cards, and at the time it was something of a controversy, especially as none of its AIB partners went for the connector. As it turned out, the connector was largely accepted by the PCI-SIG, with a few additions that led to the 12VHPWR connector. The main difference between the two is the addition of a small set of sense pins, making it a 12+4-pin connector. It has now been confirmed that the 12VHPWR connector will work with NVIDIA's Founders Edition cards. This isn't a huge surprise as such, but it is good news for those who happen to own a Founders Edition card and are looking to invest in a new PSU.

However, what's more interesting in the news about the 12VHPWR connector is that it will operate in two distinct modes. If the 4-pin sense connector isn't connected to the GPU, the PSU will only deliver 450 Watts to the GPU, presumably as some kind of safety precaution. On the other hand, if the sense connector is used, the same cable can deliver up to 600 Watts, which, together with the 75 Watts supplied by the PCIe slot, would allow for a combined card power draw of up to 675 Watts for next-generation GPUs. It's possible that we'll see cards with multiple power thresholds that are negotiated on the fly with the PSU, and we might also see PSUs that can force the GPU into a lower power state if the overall system load gets too high. It'll be interesting to see what the new standard delivers, since so far not a lot of detail has been released regarding how the sense function works.
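To illustrate the idea, here is a minimal sketch in C of how a card might pick its power ceiling from the sense pins at power-up. Since the detailed spec hasn't been published, everything below is an assumption: the pin names, the open/ground encoding and the sampling model are purely hypothetical, treating the sense lines as simple straps the GPU reads once at boot.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical sketch: treat the sense lines as open/ground straps that
 * the graphics card samples at power-up. The encoding is illustrative
 * only; the actual PCI-SIG assignment hasn't been made public. */
typedef struct {
    bool sense0_grounded;
    bool sense1_grounded;
} sense_pins_t;

/* Maximum power (in watts) the card may draw over the connector for a
 * given sense-pin state. Floating/absent pins fall back to the safe
 * 450 W default described in the article. */
static int connector_power_limit(sense_pins_t pins)
{
    if (pins.sense0_grounded && pins.sense1_grounded)
        return 600; /* full 600 W mode */
    return 450;     /* sense connector not plugged in */
}

int main(void)
{
    sense_pins_t no_sense = { false, false };
    sense_pins_t full     = { true, true };

    /* Total board power = connector limit + 75 W from the PCIe slot. */
    printf("No sense pins:  %d W from the cable, %d W total\n",
           connector_power_limit(no_sense),
           connector_power_limit(no_sense) + 75);
    printf("Sense pins set: %d W from the cable, %d W total\n",
           connector_power_limit(full),
           connector_power_limit(full) + 75);
    return 0;
}
```

Run as-is, the sketch simply prints the two modes described above: 450 W + 75 W without the sense pins, and 600 W + 75 W = 675 W with them.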
Sources: VideoCardz, via HardwareLuxx

36 Comments on Upcoming PCIe 12VHPWR Connectors Compatible with NVIDIA RTX 30-series Founders Edition Graphics Cards

#1
DeathtoGnomes
That sense connector seems like a way to limit performance. Take two identical next-gen (40xx) cards, one with this connector and one without: the one without would have significantly lower clocks and limited overclocking.
#2
TheLostSwede
News Editor
DeathtoGnomes: That sense connector seems like a way to limit performance. Take two identical next-gen (40xx) cards, one with this connector and one without: the one without would have significantly lower clocks and limited overclocking.
As this new sense connector is part of the PCI-SIG spec, I guess we won't be seeing any new cards without it.
#3
DeathtoGnomes
TheLostSwede: As this new sense connector is part of the PCI-SIG spec, I guess we won't be seeing any new cards without it.
I'll stick with my magic 8-ball on this one.
#4
TheLostSwede
News Editor
DeathtoGnomes: I'll stick with my magic 8-ball on this one.
I think you might need to install a firmware update...
#5
Prima.Vera
675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.
#6
Chomiq
Moar powah seems like the wrong way to go.
#7
TheLostSwede
News Editor
Prima.Vera: 675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.
It's the upper limit, not lower...
#8
Prima.Vera
TheLostSwede: It's the upper limit, not lower...
I know. But the thought that there could be video cards that suck more than 600W is beyond stupid.
#9
Unregistered
I can't see a GPU with an air cooler capable of dissipating 600W; you can't even do that with a CPU.
#10
TheLostSwede
News Editor
Prima.Vera: I know. But the thought that there could be video cards that suck more than 600W is beyond stupid.
Maybe we should ask the game developers to write better code?
#11
ratirt
TheLostSwede: Maybe we should ask the game developers to write better code?
Is it only the code that matters here?

New connector, probably new possibilities.
#12
TheLostSwede
News Editor
ratirt: Is it only the code that matters here?

New connector, probably new possibilities.
Well, obviously games are getting more advanced, but no one really asked for RT support, yet we got it, and it seems to be part of the reason why new cards need even more power.
#13
ratirt
TheLostSwede: Well, obviously games are getting more advanced, but no one really asked for RT support, yet we got it, and it seems to be part of the reason why new cards need even more power.
Oh, you meant RT. Either way, I don't think game code can do much to reduce the hardware's needs, and thus the power consumption that comes with them. The GPU chip architecture has more influence on that.
#14
TheLostSwede
News Editor
ratirt: Oh, you meant RT. Either way, I don't think game code can do much to reduce the hardware's needs, and thus the power consumption that comes with them. The GPU chip architecture has more influence on that.
It was just one example. Some games seem to be poorly coded these days as well, as developers rely more and more on the powerful hardware people have and can as such get away with "sloppier" code.
Because of this, we need more and more powerful hardware all the time. All the bad console ports are a great example of this: many of them run great on console, but suck on PC.
#15
ratirt
TheLostSwede: It was just one example. Some games seem to be poorly coded these days as well, as developers rely more and more on the powerful hardware people have and can as such get away with "sloppier" code.
Because of this, we need more and more powerful hardware all the time. All the bad console ports are a great example of this: many of them run great on console, but suck on PC.
I'm not saying no to that, though. The way I see it, poorly coded games simply run slower while using more resources. Resource utilization is poor, so in order to achieve better performance the card needs to run at its maximum, and that correlates with power draw nonetheless.
#16
bonehead123
Geesh... 600W+ for da GPU, plus whatever the CPU & other components need. If this trend of moar increased powah continues much longer, we will all need to install a small warp core and cryochamber chiller next to the house just to use a friggin' newish 'putin box, hahaha :)
#18
DeathtoGnomes
TheLostSwede: Maybe we should ask the game developers to write better code?
While true, it shows how experienced developers are, or are not. The bigger hit is those same developers not being able to integrate APIs optimally; sometimes the API even needs to be edited to 'fit in'. It's quite obvious most developers don't know how to optimize and, if rumors are true, many use a third party to do so.
#19
Divide Overflow
Will NVIDIA be claiming royalties on all modern power supplies now? :rolleyes:
#20
TheLostSwede
News Editor
Divide Overflow: Will NVIDIA be claiming royalties on all modern power supplies now? :rolleyes:
This has nothing to do with them.
#21
Juventas
I just installed an FE card with the 12-pin connector in a computer with an ATX12VO power supply. I can't imagine there are many out there with this combo right now.

If a power supply came with a 12VHPWR connector, could it be adapted to the old 6/8-pin connectors? If so, I don't see any reason not to make this the standard.
#22
Assimilator
Correlation does not imply causation, children. Just because the 12VHPWR connector can provide up to 600W of power does not mean that next-gen GPUs will consume that much power. Engineers always build in extra capacity when designing new power connectors; otherwise said connectors would be obsolete within a year, which would entirely negate the point of standardising them.

The RTX 3080 Ti draws a maximum of 359W, which means its 12-pin connector provides under 300W of power once you subtract what the PCIe slot supplies, less than half of the proposed 12VHPWR limit. If you really think the RTX 4080 Ti is somehow magically going to consume double the power, when no graphics card generation-on-generation has ever done that, then you're not thinking; you're a moron.
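For reference, a quick sanity check of that arithmetic, assuming the usual 75 W budget for a PCIe x16 slot (the 359 W figure is from the comment above, not from any spec):

```c
#include <stdio.h>

int main(void)
{
    const int slot_power_w = 75;   /* what a PCIe x16 slot can supply */
    const int rtx3080ti_w  = 359;  /* maximum board power quoted above */
    const int hpwr_limit_w = 600;  /* proposed 12VHPWR ceiling */

    /* Board power minus slot power = what the 12-pin actually carries. */
    int connector_w = rtx3080ti_w - slot_power_w;  /* 284 W */
    printf("12-pin draw: %d W, i.e. %.0f%% of the 12VHPWR limit\n",
           connector_w, 100.0 * connector_w / hpwr_limit_w);
    return 0;
}
```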
#23
eidairaman1
The Exiled Airman
Divide Overflow: Will NVIDIA be claiming royalties on all modern power supplies now? :rolleyes:
Trying to be proprietary along with Intel, f 'em both.
#24
Assimilator
eidairaman1: Trying to be proprietary along with Intel, f 'em both.
I suggest you open a dictionary and look up the definition of "proprietary", because it does not mean what you think it means.
#25
walker15130
I see absolutely no point in having those extra sense pins. I wonder if they will actually end up implemented in new GPUs. So far, every piece of information related to them has shown no reason for their existence other than "the drawing/standard says so" and "they are just connected to ground". The PSU will not change the way it delivers power to the GPU, because that's not how a PSU works. Realistically they can only be implemented in GPU firmware, and that adds no value to the product.

I just hate the idea. You have a neat bundle of 12 wires and then there's this lower-gauge bullshit serving as... grounds, just tacked on :( If anyone has a better explanation, I'd like to hear it.