Tuesday, September 24th 2024
NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors
NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is cooked, and that future generations of logic hardware will only get larger and hotter, or hungrier for power. NVIDIA's next generation "Blackwell" graphics architecture promises to bring certain architecture-level performance/Watt improvements, coupled with the node-level performance/Watt improvements from the switch to the TSMC 4NP (4 nm-class) node. Even so, the GeForce RTX 5090, or the part that succeeds the current RTX 4090, will be a power hungry GPU, with rumors suggesting the need for two 16-pin power inputs.
TweakTown reports that the RTX 5090 could come with two 16-pin power connectors, giving the card the theoretical ability to pull 1200 W continuous. This doesn't mean the GPU's total graphics power (TGP) is 1200 W, but rather a number close to or greater than 600 W, which calls for two of these connectors. Even if the TGP is exactly 600 W, NVIDIA would want to deploy two inputs to spread the load across both connectors and improve their physical resilience. Both connectors are likely to be rated for 600 W input, so end-users can't mix them up, as they could if one were rated for 600 W and the other keyed to 150 W or 300 W.

Above is a quick Photoshop mock-up by TweakTown of what such a card could look like. The requirement of two 16-pin connectors should rule out older PSU types. NVIDIA will likely include only one adapter that converts two or three 8-pin PCIe power connectors into a 16-pin, with the other input expected to be a native 600 W connection from an ATX 3.0 or ATX 3.1 PSU; most newer ATX 3.0/3.1 PSUs on the market have only one native 16-pin connector plus three or four additional 8-pin PCIe power connectors. As for the connector itself, it is very likely a 12V-2x6 with backward compatibility for 12VHPWR.
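As a rough sanity check on the load-spreading argument, here is a minimal back-of-the-envelope sketch in Python. It assumes the nominal 12 V rail and the six current-carrying pin pairs of a 12V-2x6 connector; the per-connector figures are illustrative, not NVIDIA's actual design targets:

```python
# Back-of-the-envelope check (assumptions: nominal 12 V rail, six
# current-carrying 12V/GND pin pairs per 16-pin connector, 600 W rating).
RAIL_VOLTAGE = 12.0          # volts
CONNECTOR_RATING_W = 600.0   # watts per 16-pin connector
PIN_PAIRS = 6                # 12 V pins carrying current in parallel

def per_pin_current(connector_load_w: float) -> float:
    """Current through each 12 V pin for a given connector load."""
    total_amps = connector_load_w / RAIL_VOLTAGE
    return total_amps / PIN_PAIRS

# One connector carrying its full 600 W rating:
print(f"{per_pin_current(600):.2f} A per pin")   # ~8.33 A
# The same 600 W TGP split across two connectors, as rumored:
print(f"{per_pin_current(300):.2f} A per pin")   # ~4.17 A
```

Halving the per-pin current is what the "physical resilience" argument comes down to: less current per contact means less resistive heating at any given contact resistance.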
Some PSU manufacturers are beginning to release high-wattage models with two native 12V-2x6 connectors. These typically have capacities above 1300 W. The Seasonic Prime PX-2200 (2200 W), released earlier this week, is an extreme example of this trend: besides its high wattage, this PSU offers as many as four 12V-2x6 connectors. Another recent example is the MSI MEG AI1600T PCIE5 (1600 W), with two native 600 W 12V-2x6 connectors.
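Purely as an illustration of why dual-12V-2x6 cards push builders toward such PSUs, here is a hypothetical system power budget in Python; every component figure below is an assumption for the sake of the arithmetic, not a measured spec:

```python
# Hypothetical build budget illustrating PSU sizing for a dual-connector card.
# All component figures are assumptions, not measured values.
components_w = {
    "GPU (rumored RTX 5090 TGP)": 600,
    "CPU (high-end desktop)": 250,
    "Motherboard/RAM/storage/fans": 100,
}
total = sum(components_w.values())
headroom = 1.3  # ~30% margin for power transients, a common rule of thumb
print(f"Estimated draw: {total} W; suggested PSU: {total * headroom:.0f} W")
# Estimated draw: 950 W; suggested PSU: 1235 W
```

Under those assumptions, even a 600 W TGP card lands squarely in the 1300 W+ PSU class once the rest of the system and transient headroom are accounted for.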
Source:
TweakTown
110 Comments on NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors
Ignoring for a moment that no one is forced to buy this stuff; besides, if you have $1k+ for a single part, another 10 bucks on your power bill won't matter.
In a fast-evolving world, the applicability of PCs, and GPUs in particular, is also changing at a fast pace.
Gaming or not, stuff keeps getting more expensive, more moronic in its dimensions, and more power hungry. Ada is two years old at this point and there's still no 30 W option. One can't build an HTPC that supports DLSS/RT (well, realistically, RT doesn't matter on such machines), or at least has next-gen dedicated graphics inside. The GT 1030 is horribly weak and terribly old. The RX 6400 is also far from ideal.
I suppose in a class full of children I'm better off trying to fit in.
I really hope this breaks NVIDIA when gamers just plain up and say no. When sales slump and no one wants it, they will finally realize their mistakes and do a redesign. This has happened before with NVIDIA. Remember, they had the GeForce series up to its last card, the 9800. Then they redesigned their cards and chips, and the GTX series came out: a lower-power design with higher GPU output. That was smart thinking. The GTX era ended with the 1000-series cards, which I still own, and which still keep up with current games today. Now we are in the RTX series, up to the 5000 now. I wonder when it will break.
The maximum power an entire home can use is limited by the transformer out on the pole, so really it depends on the rating of the transformer and the number of homes being fed by it.
I think 3-phase is limited to commercial and industrial sites here.
Don't quote me on any of this, it's been a while.
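To put rough numbers on the transformer point above, here is a short Python sketch; every figure is an assumption for a typical North American split-phase home, and actual service and transformer ratings vary:

```python
# Rough illustration of household power limits (all figures are assumptions).
service_amps = 200        # common residential service panel rating
service_volts = 240       # North American split-phase service voltage
home_limit_w = service_amps * service_volts
print(f"Panel limit: {home_limit_w / 1000:.0f} kW")   # 48 kW

# The shared pole transformer (assume 50 kVA feeding 4 homes) is the real
# bottleneck: it is sized well below the sum of the panel ratings, on the
# assumption that homes rarely draw peak power simultaneously.
transformer_kva = 50
homes = 4
print(f"~{transformer_kva / homes:.1f} kVA average share per home")  # 12.5 kVA
```

Even under these assumptions, a 1200 W GPU is a rounding error against the panel rating; the commenter's point is that the transformer, not the house, sets the hard ceiling.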
Before some of it gains the capacity to decide it could do better without physical humans, who inefficiently take up space and atoms that could have been more compute, and makes it so. I wish I was joking. With luck it would still run things like virtual people on that ex-humanity hardware, and that would be one of the better outcomes.
But now the tides have completely turned, so much so that NVIDIA is pumping other sectors with income from AI; there is absolutely no other explanation for Gaming raking in record growth at the end of a generation's life cycle.
Kinda funny when everyone talks about saving the planet etc. :laugh:
Ignoring those in forums like this, or the few who saved for "years" to buy a big chip to keep for years.
Stunts like the Titan, 3090, 4090, and now this monster are meant for workstations with Xeon/EPYC CPUs, where you slap four of the fuckers in the box. Most PC gaming actually happens on xx60-class or lower GPUs; that's always been the case.
The real progress in the GPU industry isn't how ridiculously expensive and overkill you can make something. Running a game at twice the resolution and four times the framerate of the developer's intent doesn't really make the game that much better, and a diminishing number of customers are willing to pay an increasingly insulting amount of money for that privilege, making the flagship GPU market something game devs will take even less seriously than they already do. They want their games to run on as many PCs and consoles as possible, which means 300 GB installs requiring a $3,000 PC to hit acceptable framerates just aren't going to happen.
Meanwhile, mobile-first hardware from Qualcomm and Apple is encroaching on the traditional desktop, replacing legacy architectures and silicon designs with ridiculously optimised hardware that can run an entire system at desktop-comparable performance within the power budget of a traditional desktop's motherboard chipset alone. These chips can game right now in many popular titles, and it won't be long before such ultra-efficient, low-power devices are the default hardware spec that developers target. Sure, your 1200 W GPU has 48 GB of VRAM and can render at 8K120, but the game is designed to run on a hypothetical Nintendo Switch 2 and plays best on that Switch 2, running on batteries with a 40 mm fan as the sole cooling for the entire system. Cloud gaming is increasingly irrelevant as the shift to mobile devices continues, because a stable, uncontested, low-latency internet connection cannot be guaranteed.
Latency and input lag kills fun, especially inconsistent latency and input lag - and games only get played if they're fun, IME.
Almost guaranteed that if you lower your 3060/3060 Ti clocks to A2000 levels, you will obtain similar efficiency figures, minus the inconvenience of the additional PCIe power cable.
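Anyone curious can test that claim themselves. Below is a minimal Python sketch, assuming the nvidia-ml-py (pynvml) bindings and a single NVIDIA GPU at index 0, that averages board power over a few seconds; run it at stock clocks and again after lowering clocks (for example with nvidia-smi's --lock-gpu-clocks option) to compare:

```python
# Minimal NVML sketch to sample average board power (pip install nvidia-ml-py).
# Assumes one NVIDIA GPU at index 0 and a recent driver exposing NVML.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
for _ in range(10):
    # nvmlDeviceGetPowerUsage reports milliwatts; convert to watts
    samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
    time.sleep(0.5)

print(f"Average board power: {sum(samples) / len(samples):.1f} W")
pynvml.nvmlShutdown()
```

Comparing the two averages against a fixed workload (same scene, capped framerate) gives a rough frames-per-watt figure for each clock configuration.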
Meanwhile, if I wanted 48 GB of VRAM at sub-3090 performance, I'd have no option other than spending $5,000 on an A6000, or adding a second GPU.